Oct 06 13:03:39 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 06 13:03:39 crc restorecon[4668]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 13:03:39 crc restorecon[4668]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 
13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 13:03:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:39
crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 
13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 13:03:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc 
restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:39 crc restorecon[4668]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:39 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 
crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 13:03:40 crc restorecon[4668]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc 
restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 13:03:40 crc restorecon[4668]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 13:03:40 crc restorecon[4668]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 06 13:03:40 crc kubenswrapper[4867]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 13:03:40 crc kubenswrapper[4867]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 06 13:03:40 crc kubenswrapper[4867]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 13:03:40 crc kubenswrapper[4867]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 06 13:03:40 crc kubenswrapper[4867]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 06 13:03:40 crc kubenswrapper[4867]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.979011 4867 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985802 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985842 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985855 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985866 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985877 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985886 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985896 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985906 4867 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985915 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985924 4867 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985933 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985946 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985957 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985968 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985979 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.985991 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986001 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986009 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986018 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986026 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986035 4867 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986043 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986052 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986060 4867 feature_gate.go:330] 
unrecognized feature gate: InsightsRuntimeExtractor Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986068 4867 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986077 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986086 4867 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986097 4867 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986107 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986118 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986128 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986138 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986159 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986169 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986179 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986188 4867 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986197 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986207 4867 feature_gate.go:330] 
unrecognized feature gate: MachineConfigNodes Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986216 4867 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986226 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986236 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986247 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986286 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986297 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986308 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986319 4867 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986328 4867 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986337 4867 feature_gate.go:330] unrecognized feature gate: Example Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986346 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986355 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986365 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986374 4867 feature_gate.go:330] unrecognized feature gate: 
GCPClusterHostedDNS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986383 4867 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986391 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986400 4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986409 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986418 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986427 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986435 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986443 4867 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986452 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986460 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986471 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986479 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986489 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986497 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 
13:03:40.986506 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986514 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986523 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986531 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.986539 4867 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986715 4867 flags.go:64] FLAG: --address="0.0.0.0" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986735 4867 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986752 4867 flags.go:64] FLAG: --anonymous-auth="true" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986765 4867 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986779 4867 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986789 4867 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986803 4867 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986818 4867 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986828 4867 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986839 4867 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986849 4867 
flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986860 4867 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986871 4867 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986881 4867 flags.go:64] FLAG: --cgroup-root="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986890 4867 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986900 4867 flags.go:64] FLAG: --client-ca-file="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986909 4867 flags.go:64] FLAG: --cloud-config="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986920 4867 flags.go:64] FLAG: --cloud-provider="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986929 4867 flags.go:64] FLAG: --cluster-dns="[]" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986941 4867 flags.go:64] FLAG: --cluster-domain="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986950 4867 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986960 4867 flags.go:64] FLAG: --config-dir="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986971 4867 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986982 4867 flags.go:64] FLAG: --container-log-max-files="5" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.986995 4867 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987007 4867 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987018 4867 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 
13:03:40.987028 4867 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987039 4867 flags.go:64] FLAG: --contention-profiling="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987049 4867 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987058 4867 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987068 4867 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987078 4867 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987090 4867 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987100 4867 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987110 4867 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987120 4867 flags.go:64] FLAG: --enable-load-reader="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987131 4867 flags.go:64] FLAG: --enable-server="true" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987142 4867 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987156 4867 flags.go:64] FLAG: --event-burst="100" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987166 4867 flags.go:64] FLAG: --event-qps="50" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987176 4867 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987186 4867 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987197 4867 flags.go:64] FLAG: --eviction-hard="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 
13:03:40.987210 4867 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987219 4867 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987229 4867 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987242 4867 flags.go:64] FLAG: --eviction-soft="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987279 4867 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987289 4867 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987301 4867 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987312 4867 flags.go:64] FLAG: --experimental-mounter-path="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987322 4867 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987332 4867 flags.go:64] FLAG: --fail-swap-on="true" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987342 4867 flags.go:64] FLAG: --feature-gates="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987354 4867 flags.go:64] FLAG: --file-check-frequency="20s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987364 4867 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987375 4867 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987385 4867 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987396 4867 flags.go:64] FLAG: --healthz-port="10248" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987406 4867 flags.go:64] FLAG: --help="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 
13:03:40.987417 4867 flags.go:64] FLAG: --hostname-override="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987427 4867 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987437 4867 flags.go:64] FLAG: --http-check-frequency="20s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987448 4867 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987457 4867 flags.go:64] FLAG: --image-credential-provider-config="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987467 4867 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987477 4867 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987486 4867 flags.go:64] FLAG: --image-service-endpoint="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987496 4867 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987506 4867 flags.go:64] FLAG: --kube-api-burst="100" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987516 4867 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987526 4867 flags.go:64] FLAG: --kube-api-qps="50" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987536 4867 flags.go:64] FLAG: --kube-reserved="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987546 4867 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987555 4867 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987565 4867 flags.go:64] FLAG: --kubelet-cgroups="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987575 4867 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 06 13:03:40 crc 
kubenswrapper[4867]: I1006 13:03:40.987585 4867 flags.go:64] FLAG: --lock-file="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987596 4867 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987607 4867 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987617 4867 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987632 4867 flags.go:64] FLAG: --log-json-split-stream="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987642 4867 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987652 4867 flags.go:64] FLAG: --log-text-split-stream="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987662 4867 flags.go:64] FLAG: --logging-format="text" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987673 4867 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987684 4867 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987695 4867 flags.go:64] FLAG: --manifest-url="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987706 4867 flags.go:64] FLAG: --manifest-url-header="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987720 4867 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987730 4867 flags.go:64] FLAG: --max-open-files="1000000" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987743 4867 flags.go:64] FLAG: --max-pods="110" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987753 4867 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987763 4867 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 06 13:03:40 crc 
kubenswrapper[4867]: I1006 13:03:40.987774 4867 flags.go:64] FLAG: --memory-manager-policy="None" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987784 4867 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987794 4867 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987804 4867 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987814 4867 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987836 4867 flags.go:64] FLAG: --node-status-max-images="50" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987846 4867 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987856 4867 flags.go:64] FLAG: --oom-score-adj="-999" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987866 4867 flags.go:64] FLAG: --pod-cidr="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987875 4867 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987892 4867 flags.go:64] FLAG: --pod-manifest-path="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987902 4867 flags.go:64] FLAG: --pod-max-pids="-1" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987912 4867 flags.go:64] FLAG: --pods-per-core="0" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987921 4867 flags.go:64] FLAG: --port="10250" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987932 4867 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987942 4867 flags.go:64] FLAG: 
--provider-id="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987952 4867 flags.go:64] FLAG: --qos-reserved="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987962 4867 flags.go:64] FLAG: --read-only-port="10255" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987972 4867 flags.go:64] FLAG: --register-node="true" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987982 4867 flags.go:64] FLAG: --register-schedulable="true" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.987993 4867 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988010 4867 flags.go:64] FLAG: --registry-burst="10" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988020 4867 flags.go:64] FLAG: --registry-qps="5" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988029 4867 flags.go:64] FLAG: --reserved-cpus="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988039 4867 flags.go:64] FLAG: --reserved-memory="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988052 4867 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988062 4867 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988073 4867 flags.go:64] FLAG: --rotate-certificates="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988082 4867 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988092 4867 flags.go:64] FLAG: --runonce="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988102 4867 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988112 4867 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988124 4867 flags.go:64] FLAG: --seccomp-default="false" Oct 
06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988134 4867 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988144 4867 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988154 4867 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988166 4867 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988176 4867 flags.go:64] FLAG: --storage-driver-password="root" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988188 4867 flags.go:64] FLAG: --storage-driver-secure="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988199 4867 flags.go:64] FLAG: --storage-driver-table="stats" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988210 4867 flags.go:64] FLAG: --storage-driver-user="root" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988220 4867 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988231 4867 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988241 4867 flags.go:64] FLAG: --system-cgroups="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988300 4867 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988318 4867 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988329 4867 flags.go:64] FLAG: --tls-cert-file="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988341 4867 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988354 4867 flags.go:64] FLAG: --tls-min-version="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988364 4867 flags.go:64] FLAG: 
--tls-private-key-file="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988374 4867 flags.go:64] FLAG: --topology-manager-policy="none" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988385 4867 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988395 4867 flags.go:64] FLAG: --topology-manager-scope="container" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988407 4867 flags.go:64] FLAG: --v="2" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988420 4867 flags.go:64] FLAG: --version="false" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988434 4867 flags.go:64] FLAG: --vmodule="" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988448 4867 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.988460 4867 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988683 4867 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988696 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988705 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988714 4867 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988723 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988732 4867 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988741 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988750 
4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988759 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988767 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988776 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988785 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988794 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988804 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988813 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988822 4867 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988831 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988840 4867 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988849 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988858 4867 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988869 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988877 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver 
Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988886 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988897 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988909 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988920 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988931 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988941 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988950 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988959 4867 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988968 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988976 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.988987 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989173 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989222 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989228 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989234 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989239 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989244 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989290 4867 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989295 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989299 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989303 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989308 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989312 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989316 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989320 4867 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989324 4867 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989328 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989331 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989336 4867 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989343 4867 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989349 4867 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989354 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989358 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989361 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989365 4867 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989369 4867 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989373 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989376 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989380 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989384 4867 feature_gate.go:330] unrecognized 
feature gate: RouteAdvertisements Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989388 4867 feature_gate.go:330] unrecognized feature gate: Example Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989392 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989395 4867 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989407 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989413 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989417 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989421 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989424 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.989429 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.989452 4867 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 13:03:40 crc kubenswrapper[4867]: I1006 13:03:40.999439 4867 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 06 13:03:40 crc 
kubenswrapper[4867]: I1006 13:03:40.999495 4867 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999570 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999580 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999585 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999590 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999595 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999599 4867 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999604 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999609 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999614 4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999618 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999624 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999629 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999635 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999639 4867 feature_gate.go:330] unrecognized feature gate: 
GatewayAPI Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999644 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999648 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999653 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999661 4867 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999666 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999671 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999675 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999681 4867 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999685 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999689 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999693 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999697 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999701 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999705 4867 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999710 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999715 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999720 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 13:03:40 crc kubenswrapper[4867]: W1006 13:03:40.999726 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999730 4867 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999734 4867 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999740 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999744 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999748 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999752 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999756 4867 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999760 4867 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999763 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999767 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999771 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999776 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999780 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999784 4867 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999790 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999794 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999798 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999802 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999805 4867 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999809 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999812 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999816 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999820 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999824 4867 feature_gate.go:330] unrecognized feature gate: Example Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999827 4867 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999831 4867 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999834 4867 
feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999838 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999841 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999845 4867 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999849 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999854 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999859 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999863 4867 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999868 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999872 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999876 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999880 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:40.999884 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:40.999892 4867 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false 
NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000018 4867 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000030 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000035 4867 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000039 4867 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000042 4867 feature_gate.go:330] unrecognized feature gate: Example Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000046 4867 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000051 4867 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000058 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000063 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000068 4867 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000072 4867 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000076 4867 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000080 4867 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000084 4867 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000088 4867 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000092 4867 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000096 4867 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000100 4867 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000103 4867 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000107 4867 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000111 4867 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000115 4867 
feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000118 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000122 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000126 4867 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000129 4867 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000133 4867 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000137 4867 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000140 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000144 4867 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000148 4867 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000152 4867 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000156 4867 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000160 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000166 4867 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000170 4867 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000174 4867 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000178 4867 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000182 4867 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000186 4867 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000190 4867 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000195 4867 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000199 4867 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000203 4867 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000207 4867 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000211 4867 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000214 4867 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000218 4867 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000221 4867 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000225 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000228 4867 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000231 4867 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000235 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000240 4867 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000244 4867 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000248 4867 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000306 4867 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000310 4867 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000313 4867 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000317 4867 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000321 4867 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000325 4867 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000328 4867 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000332 4867 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000336 4867 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000340 4867 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000343 4867 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000347 4867 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 
13:03:41.000351 4867 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000355 4867 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.000360 4867 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.000366 4867 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.000562 4867 server.go:940] "Client rotation is on, will bootstrap in background" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.004390 4867 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.004486 4867 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.005897 4867 server.go:997] "Starting client certificate rotation" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.005924 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.007647 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-01 22:58:16.87414038 +0000 UTC Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.007793 4867 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2097h54m35.866351363s for next certificate rotation Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.031712 4867 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.034105 4867 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.054579 4867 log.go:25] "Validated CRI v1 runtime API" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.092965 4867 log.go:25] "Validated CRI v1 image API" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.094934 4867 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.102522 4867 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-06-12-58-48-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.102577 4867 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.132988 4867 manager.go:217] Machine: {Timestamp:2025-10-06 13:03:41.128526947 +0000 UTC m=+0.586475171 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:bd3761ce-1fa1-4021-80de-a06d0f4530ae BootID:f975b18a-cd36-4c7e-a04c-71b0f488ca5c Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:7b:2e:1a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:7b:2e:1a Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a2:48:37 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:37:86:a8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:8f:e6:3f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ec:81:45 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:6e:b0:f5:32:8b:5f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:4e:82:db:f4:91:dd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.133427 4867 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.133742 4867 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.135603 4867 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.136079 4867 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.136218 4867 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.136664 4867 topology_manager.go:138] "Creating topology manager with none policy" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.136688 4867 container_manager_linux.go:303] "Creating device plugin manager" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.137450 4867 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.137486 4867 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.138368 4867 state_mem.go:36] "Initialized new in-memory state store" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.138515 4867 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.142941 4867 kubelet.go:418] "Attempting to sync node with API server" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.142994 4867 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.143027 4867 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.143052 4867 kubelet.go:324] "Adding apiserver pod source" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.143072 4867 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 
13:03:41.147590 4867 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.148735 4867 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.149993 4867 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.151961 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.151990 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.152004 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.152014 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.151925 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.151923 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.152032 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.152232 4867 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/secret" Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.152147 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.152282 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.152315 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.152335 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.152349 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.152147 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.152387 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.152406 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.153545 4867 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.154384 4867 server.go:1280] "Started kubelet" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 
13:03:41.155229 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.155308 4867 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.155308 4867 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.155919 4867 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 06 13:03:41 crc systemd[1]: Started Kubernetes Kubelet. Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.167562 4867 server.go:460] "Adding debug handlers to kubelet server" Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.166586 4867 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.198:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186be89883546f2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-06 13:03:41.154332458 +0000 UTC m=+0.612280642,LastTimestamp:2025-10-06 13:03:41.154332458 +0000 UTC m=+0.612280642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.169229 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 
13:03:41.169275 4867 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.169494 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 21:12:13.348344339 +0000 UTC Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.169604 4867 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1112h8m32.17874304s for next certificate rotation Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.169521 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.169595 4867 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.169570 4867 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.169863 4867 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.170009 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="200ms" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.170565 4867 factory.go:55] Registering systemd factory Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.170605 4867 factory.go:221] Registration of the systemd container factory successfully Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.170521 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 
06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.170713 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.172838 4867 factory.go:153] Registering CRI-O factory Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.172875 4867 factory.go:221] Registration of the crio container factory successfully Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.172958 4867 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.172988 4867 factory.go:103] Registering Raw factory Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.173009 4867 manager.go:1196] Started watching for new ooms in manager Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.174398 4867 manager.go:319] Starting recovery of all containers Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.176806 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.176888 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 
13:03:41.176900 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.176910 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.176921 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.176930 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.176939 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.176948 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.176957 4867 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.176965 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.176976 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.176985 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.177017 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.177029 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.177038 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179410 4867 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179489 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179514 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179530 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179545 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179581 4867 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179597 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179610 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179625 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179664 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179678 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179702 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179754 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179770 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179786 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179825 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179844 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179862 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179875 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179925 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179943 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179956 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.179996 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180021 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" 
seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180041 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180125 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180148 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180188 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180203 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180218 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 
13:03:41.180237 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180291 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180313 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180330 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180344 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180360 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180399 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180414 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180437 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180451 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180465 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180481 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180494 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180570 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180587 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180601 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180614 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180626 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180642 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" 
seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180656 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180671 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180689 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180841 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180859 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180873 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 
13:03:41.180885 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180898 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180918 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180933 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180945 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180959 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180974 4867 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.180987 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181001 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181017 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181030 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181047 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181069 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181088 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181104 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181118 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181132 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181152 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181171 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181212 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181229 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181243 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181269 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181282 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181294 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" 
seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181308 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181321 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181333 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181345 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181357 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181370 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181382 
4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181395 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181408 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181422 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181444 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181463 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181479 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181492 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181508 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181522 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181538 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181552 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181565 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181578 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181591 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181605 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181618 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181632 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181657 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181671 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181682 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181696 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181711 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181724 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181736 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181749 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181779 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181792 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181805 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181819 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181833 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" 
seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181845 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181858 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.181872 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182102 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182120 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182131 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 
13:03:41.182144 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182159 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182175 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182192 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182208 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182223 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182237 4867 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182268 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182281 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182295 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182307 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182320 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182332 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182343 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182355 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182368 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182382 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182393 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182406 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182421 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182434 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182447 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182459 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182472 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182484 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" 
seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182495 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182508 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182521 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182533 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182545 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182562 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182576 4867 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182590 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182603 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182616 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182636 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182647 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182661 4867 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182674 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182685 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182697 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182709 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182721 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182734 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182747 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182758 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182771 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182782 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182841 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182854 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.182866 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183122 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183162 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183179 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183281 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183298 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183322 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183340 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183356 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183378 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183411 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183537 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183563 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183578 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183902 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183922 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183939 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.183961 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.184060 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.184092 4867 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.184120 4867 reconstruct.go:97] "Volume reconstruction finished" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.184130 4867 reconciler.go:26] "Reconciler: start to sync state" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.197971 4867 manager.go:324] Recovery completed Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.207669 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.210213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.210268 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.210279 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.210898 4867 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.210912 4867 cpu_manager.go:226] "Reconciling" 
reconcilePeriod="10s" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.210936 4867 state_mem.go:36] "Initialized new in-memory state store" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.216272 4867 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.219290 4867 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.219485 4867 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.219934 4867 kubelet.go:2335] "Starting kubelet main sync loop" Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.220063 4867 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.221794 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.221875 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.235393 4867 policy_none.go:49] "None policy: Start" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.236551 4867 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.236580 4867 state_mem.go:35] "Initializing new 
in-memory state store" Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.269993 4867 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.304662 4867 manager.go:334] "Starting Device Plugin manager" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.304745 4867 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.304772 4867 server.go:79] "Starting device plugin registration server" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.305231 4867 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.305284 4867 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.305528 4867 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.305607 4867 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.305615 4867 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.318643 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.320794 4867 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 
13:03:41.320925 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.322896 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.322960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.322969 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.323171 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.323328 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.323370 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.324073 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.324101 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.324111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.324286 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.324327 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 
06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.324345 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.324515 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.324632 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.324668 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.325734 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.325741 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.325766 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.325782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.325789 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.325805 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.325900 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.326055 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.326112 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.327225 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.327245 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.327271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.327294 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.327313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.327326 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.327419 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.327553 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.327597 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.328153 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.328185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.328193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.328384 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.328409 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.328444 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.328454 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.328445 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.329887 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.329923 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.329934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.371401 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="400ms" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.387671 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.387724 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.387750 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.387775 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:41 crc 
kubenswrapper[4867]: I1006 13:03:41.387802 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.387823 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.387882 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.387924 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.387951 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.387975 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.387994 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.388016 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.388052 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.388088 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.388109 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.405803 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.407480 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.407526 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.407537 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.407569 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.408076 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.489726 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.489787 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.489935 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.489946 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490043 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.489810 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490133 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490157 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490177 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490194 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490214 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490231 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490276 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490297 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490316 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490370 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490390 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490407 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490408 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490406 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490408 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490429 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490466 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490508 4867 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490528 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490534 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490535 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490581 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.490641 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc 
kubenswrapper[4867]: I1006 13:03:41.490586 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.609204 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.610223 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.610270 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.610279 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.610304 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.610722 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.647102 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.654993 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.674905 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.684906 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: I1006 13:03:41.687729 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.692488 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-bd776daef6f98f061b287d0595001438e5242bde66ecdf4b85933e50bd7e5a14 WatchSource:0}: Error finding container bd776daef6f98f061b287d0595001438e5242bde66ecdf4b85933e50bd7e5a14: Status 404 returned error can't find the container with id bd776daef6f98f061b287d0595001438e5242bde66ecdf4b85933e50bd7e5a14 Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.692732 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-675280ea486a3f7c7c14f3d19d7abd72d505a2fbf43decbd8c36a86c07dbad06 WatchSource:0}: Error finding container 675280ea486a3f7c7c14f3d19d7abd72d505a2fbf43decbd8c36a86c07dbad06: Status 404 returned error can't find the container with id 675280ea486a3f7c7c14f3d19d7abd72d505a2fbf43decbd8c36a86c07dbad06 Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.697794 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-58f85eec11e41e3b79cf9e041aa2378ba643107ef39cedef1c452b6251a052f2 WatchSource:0}: Error finding container 58f85eec11e41e3b79cf9e041aa2378ba643107ef39cedef1c452b6251a052f2: Status 404 returned error can't 
find the container with id 58f85eec11e41e3b79cf9e041aa2378ba643107ef39cedef1c452b6251a052f2 Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.706683 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7f01a9f009872bd9f944d7f23469bd5eaab5eb5cc3df34c1126cd095b16e9f93 WatchSource:0}: Error finding container 7f01a9f009872bd9f944d7f23469bd5eaab5eb5cc3df34c1126cd095b16e9f93: Status 404 returned error can't find the container with id 7f01a9f009872bd9f944d7f23469bd5eaab5eb5cc3df34c1126cd095b16e9f93 Oct 06 13:03:41 crc kubenswrapper[4867]: W1006 13:03:41.707912 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-58fcc2d35c077b0924bc4fadba4cc726ee90c56a746fdd0e18a3eef1f9a4c004 WatchSource:0}: Error finding container 58fcc2d35c077b0924bc4fadba4cc726ee90c56a746fdd0e18a3eef1f9a4c004: Status 404 returned error can't find the container with id 58fcc2d35c077b0924bc4fadba4cc726ee90c56a746fdd0e18a3eef1f9a4c004 Oct 06 13:03:41 crc kubenswrapper[4867]: E1006 13:03:41.772658 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="800ms" Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.010804 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.012314 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.012345 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.012353 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.012373 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 13:03:42 crc kubenswrapper[4867]: E1006 13:03:42.012649 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Oct 06 13:03:42 crc kubenswrapper[4867]: W1006 13:03:42.012654 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:42 crc kubenswrapper[4867]: E1006 13:03:42.012704 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Oct 06 13:03:42 crc kubenswrapper[4867]: W1006 13:03:42.056442 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:42 crc kubenswrapper[4867]: E1006 13:03:42.056483 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial 
tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.156504 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.227142 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"58f85eec11e41e3b79cf9e041aa2378ba643107ef39cedef1c452b6251a052f2"} Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.229452 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"675280ea486a3f7c7c14f3d19d7abd72d505a2fbf43decbd8c36a86c07dbad06"} Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.230479 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd776daef6f98f061b287d0595001438e5242bde66ecdf4b85933e50bd7e5a14"} Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.233634 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"58fcc2d35c077b0924bc4fadba4cc726ee90c56a746fdd0e18a3eef1f9a4c004"} Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.234921 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7f01a9f009872bd9f944d7f23469bd5eaab5eb5cc3df34c1126cd095b16e9f93"} Oct 06 13:03:42 
crc kubenswrapper[4867]: W1006 13:03:42.272653 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:42 crc kubenswrapper[4867]: E1006 13:03:42.272741 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Oct 06 13:03:42 crc kubenswrapper[4867]: W1006 13:03:42.373843 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:42 crc kubenswrapper[4867]: E1006 13:03:42.373917 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Oct 06 13:03:42 crc kubenswrapper[4867]: E1006 13:03:42.573962 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="1.6s" Oct 06 13:03:42 crc kubenswrapper[4867]: E1006 13:03:42.620299 4867 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 
38.102.83.198:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186be89883546f2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-06 13:03:41.154332458 +0000 UTC m=+0.612280642,LastTimestamp:2025-10-06 13:03:41.154332458 +0000 UTC m=+0.612280642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.813451 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.814758 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.814791 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.814800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:42 crc kubenswrapper[4867]: I1006 13:03:42.814822 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 13:03:42 crc kubenswrapper[4867]: E1006 13:03:42.815220 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.156549 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.198:6443: connect: connection refused Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.240580 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81"} Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.240642 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.240647 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666"} Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.240664 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d"} Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.240676 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42"} Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.241747 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.241778 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.241798 4867 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.242566 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254" exitCode=0 Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.242657 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254"} Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.242730 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.243367 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.243392 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.243424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.244913 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.245523 4867 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe" exitCode=0 Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.245612 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe"} Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.245742 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.246345 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.246377 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.246390 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.247187 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.247208 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.247218 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.252895 4867 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b" exitCode=0 Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.252954 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b"} Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.253056 4867 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.254067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.254087 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.254096 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.255522 4867 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8" exitCode=0 Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.255554 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8"} Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.255626 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.259500 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.259556 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.259569 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:43 crc kubenswrapper[4867]: I1006 13:03:43.779908 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:44 crc kubenswrapper[4867]: W1006 13:03:44.059198 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:44 crc kubenswrapper[4867]: E1006 13:03:44.059331 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Oct 06 13:03:44 crc kubenswrapper[4867]: W1006 13:03:44.062647 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:44 crc kubenswrapper[4867]: E1006 13:03:44.062800 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.198:6443: connect: connection refused" logger="UnhandledError" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.156430 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.198:6443: connect: connection refused Oct 06 13:03:44 crc kubenswrapper[4867]: E1006 13:03:44.175398 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.198:6443: connect: connection refused" interval="3.2s" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.260357 4867 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9" exitCode=0 Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.260475 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9"} Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.260507 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.263265 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.263310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.263331 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.266035 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"195d38cd77f948a851f2f1d0343b56091b81045e48249f91f7c2ee086f4aa430"} Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.266220 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.267724 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.267757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.267768 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.273690 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8"} Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.273737 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5"} Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.273756 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63"} Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.273766 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464"} Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.273777 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097"} Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.273851 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.274993 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.275052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.275075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.276320 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"166367864dd2a7f9fc01cb33d0bc921e345533200879fd32f85edcf85d1763c2"} Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.276366 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2df801e51e6fb3509c31d3ed26436fef886ed99e5b5ec1589d83646466d76e49"} Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.276378 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6b57599ded487b78c1787aa529a0d4d2d6b684c445eb626d9efe0ddcdab321b1"} Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.276380 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:44 crc 
kubenswrapper[4867]: I1006 13:03:44.276409 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.277601 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.277632 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.277642 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.279625 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.279649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.279659 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.415767 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.417479 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.417514 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.417522 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:44 crc kubenswrapper[4867]: I1006 13:03:44.417544 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 
06 13:03:44 crc kubenswrapper[4867]: E1006 13:03:44.417964 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.198:6443: connect: connection refused" node="crc" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.280703 4867 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723" exitCode=0 Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.280770 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.280797 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.280817 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.280825 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.280843 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281362 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723"} Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281402 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281425 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281788 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281810 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281818 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281807 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281861 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281873 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281938 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281957 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.281967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.282794 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.282913 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.282944 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.282831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.282985 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:45 crc kubenswrapper[4867]: I1006 13:03:45.282996 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.272381 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.288383 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0"} Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.288500 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4"} Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.288446 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.288540 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6"} Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.288570 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265"} Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.289523 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.289561 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.289574 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.380713 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.380893 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.380931 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.382158 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.382198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.382207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.780377 4867 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 13:03:46 crc kubenswrapper[4867]: I1006 13:03:46.780782 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.139821 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.140118 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.141763 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.141870 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.141932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.284551 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.296499 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9"} Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.296563 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 
13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.296629 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.297221 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.298858 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.298923 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.298942 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.299683 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.299753 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.299779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.618732 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.621552 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.621611 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.621625 4867 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.621663 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 13:03:47 crc kubenswrapper[4867]: I1006 13:03:47.938122 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.299861 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.300033 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.300901 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.300925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.300933 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.301272 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.301495 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.301508 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.450359 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.450548 4867 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.451989 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.452040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.452052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.455000 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:48 crc kubenswrapper[4867]: I1006 13:03:48.768971 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 06 13:03:49 crc kubenswrapper[4867]: I1006 13:03:49.301531 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:49 crc kubenswrapper[4867]: I1006 13:03:49.301581 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:49 crc kubenswrapper[4867]: I1006 13:03:49.302722 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:49 crc kubenswrapper[4867]: I1006 13:03:49.302747 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:49 crc kubenswrapper[4867]: I1006 13:03:49.302757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:49 crc kubenswrapper[4867]: I1006 13:03:49.302935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Oct 06 13:03:49 crc kubenswrapper[4867]: I1006 13:03:49.302954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:49 crc kubenswrapper[4867]: I1006 13:03:49.302963 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:50 crc kubenswrapper[4867]: I1006 13:03:50.548756 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 06 13:03:50 crc kubenswrapper[4867]: I1006 13:03:50.548950 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:50 crc kubenswrapper[4867]: I1006 13:03:50.550195 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:50 crc kubenswrapper[4867]: I1006 13:03:50.550266 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:50 crc kubenswrapper[4867]: I1006 13:03:50.550278 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:51 crc kubenswrapper[4867]: E1006 13:03:51.319749 4867 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 06 13:03:51 crc kubenswrapper[4867]: I1006 13:03:51.459400 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:51 crc kubenswrapper[4867]: I1006 13:03:51.459590 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:03:51 crc kubenswrapper[4867]: I1006 13:03:51.461509 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:51 crc kubenswrapper[4867]: I1006 13:03:51.461555 4867 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:51 crc kubenswrapper[4867]: I1006 13:03:51.461568 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:03:55 crc kubenswrapper[4867]: W1006 13:03:55.043304 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 06 13:03:55 crc kubenswrapper[4867]: I1006 13:03:55.043400 4867 trace.go:236] Trace[367809537]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 13:03:45.041) (total time: 10001ms): Oct 06 13:03:55 crc kubenswrapper[4867]: Trace[367809537]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:03:55.043) Oct 06 13:03:55 crc kubenswrapper[4867]: Trace[367809537]: [10.001628083s] [10.001628083s] END Oct 06 13:03:55 crc kubenswrapper[4867]: E1006 13:03:55.043428 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 06 13:03:55 crc kubenswrapper[4867]: I1006 13:03:55.157488 4867 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 06 13:03:55 crc kubenswrapper[4867]: W1006 13:03:55.199478 4867 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 06 13:03:55 crc kubenswrapper[4867]: I1006 13:03:55.199656 4867 trace.go:236] Trace[932852529]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 13:03:45.198) (total time: 10001ms): Oct 06 13:03:55 crc kubenswrapper[4867]: Trace[932852529]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:03:55.199) Oct 06 13:03:55 crc kubenswrapper[4867]: Trace[932852529]: [10.001558985s] [10.001558985s] END Oct 06 13:03:55 crc kubenswrapper[4867]: E1006 13:03:55.199697 4867 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 06 13:03:55 crc kubenswrapper[4867]: I1006 13:03:55.404776 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 06 13:03:55 crc kubenswrapper[4867]: I1006 13:03:55.404836 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 06 13:03:55 crc kubenswrapper[4867]: I1006 13:03:55.409209 4867 patch_prober.go:28] interesting 
pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 06 13:03:55 crc kubenswrapper[4867]: I1006 13:03:55.409312 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 06 13:03:56 crc kubenswrapper[4867]: I1006 13:03:56.393402 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]log ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]etcd ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/generic-apiserver-start-informers ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/priority-and-fairness-filter ok Oct 06 13:03:56 crc kubenswrapper[4867]: 
[+]poststarthook/storage-object-count-tracker-hook ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/start-apiextensions-informers ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/start-apiextensions-controllers ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/crd-informer-synced ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/start-system-namespaces-controller ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 06 13:03:56 crc kubenswrapper[4867]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 06 13:03:56 crc kubenswrapper[4867]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/bootstrap-controller ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/start-kube-aggregator-informers ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/apiservice-registration-controller ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 06 13:03:56 crc kubenswrapper[4867]: 
[+]poststarthook/apiservice-discovery-controller ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]autoregister-completion ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/apiservice-openapi-controller ok Oct 06 13:03:56 crc kubenswrapper[4867]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 06 13:03:56 crc kubenswrapper[4867]: livez check failed Oct 06 13:03:56 crc kubenswrapper[4867]: I1006 13:03:56.393521 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:03:56 crc kubenswrapper[4867]: I1006 13:03:56.780388 4867 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 13:03:56 crc kubenswrapper[4867]: I1006 13:03:56.780515 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 06 13:03:57 crc kubenswrapper[4867]: I1006 13:03:57.144138 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:03:57 crc kubenswrapper[4867]: I1006 13:03:57.144294 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 06 13:03:57 crc kubenswrapper[4867]: I1006 13:03:57.145243 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:03:57 crc kubenswrapper[4867]: I1006 13:03:57.145295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:03:57 crc kubenswrapper[4867]: I1006 13:03:57.145306 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:00 crc kubenswrapper[4867]: E1006 13:04:00.401346 4867 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.405085 4867 trace.go:236] Trace[891328560]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 13:03:48.903) (total time: 11501ms): Oct 06 13:04:00 crc kubenswrapper[4867]: Trace[891328560]: ---"Objects listed" error: 11501ms (13:04:00.404) Oct 06 13:04:00 crc kubenswrapper[4867]: Trace[891328560]: [11.501255547s] [11.501255547s] END Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.405115 4867 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.407304 4867 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 06 13:04:00 crc kubenswrapper[4867]: E1006 13:04:00.408405 4867 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.409302 4867 trace.go:236] Trace[1625558882]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 
(06-Oct-2025 13:03:47.674) (total time: 12734ms): Oct 06 13:04:00 crc kubenswrapper[4867]: Trace[1625558882]: ---"Objects listed" error: 12734ms (13:04:00.409) Oct 06 13:04:00 crc kubenswrapper[4867]: Trace[1625558882]: [12.734257393s] [12.734257393s] END Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.409325 4867 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.466975 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.467028 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.466977 4867 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.467080 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.477534 4867 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.573787 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-etcd/etcd-crc" Oct 06 13:04:00 crc kubenswrapper[4867]: I1006 13:04:00.590724 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.155048 4867 apiserver.go:52] "Watching apiserver" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.159920 4867 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.160212 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.160545 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.160582 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.160633 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.160663 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.160818 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.161103 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.161131 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.161392 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.161460 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.166930 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.167010 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.166933 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.167130 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.167144 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.167573 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.167744 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.167750 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.168832 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.171447 4867 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 06 13:04:01 crc kubenswrapper[4867]: 
I1006 13:04:01.185838 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.202382 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.211136 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.211391 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.211465 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.211530 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.211602 4867 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.211672 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.211822 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.211814 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.211881 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.211906 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.211998 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212029 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212054 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212070 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212088 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212140 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212191 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212238 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212280 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212300 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212319 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212364 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212389 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212410 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212426 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212489 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212521 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212597 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212591 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212624 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212684 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212773 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212780 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212788 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212849 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212866 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212917 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.212999 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213180 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213336 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213388 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213419 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213470 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213598 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213654 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213677 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213602 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213854 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213906 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213935 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213951 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.213967 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.214007 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.214146 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.214354 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.214522 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.214673 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.214718 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.214738 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.214744 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.214964 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215010 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.214929 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215239 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215328 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215362 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215392 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215558 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215574 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215600 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215618 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215638 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215654 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215705 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215722 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215741 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215763 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215766 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215779 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215798 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215819 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215836 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215855 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215871 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215888 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215908 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215932 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215980 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216004 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216027 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216050 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216072 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216116 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216144 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216168 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217002 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.215910 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216126 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216324 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216442 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216510 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216573 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216622 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216642 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216739 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217397 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.216914 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217202 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217429 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217426 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217285 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217680 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217688 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217772 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217906 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217945 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217974 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.217992 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218022 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218040 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218057 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218075 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218092 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218110 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218128 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218145 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218174 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218191 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218208 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218226 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218241 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218281 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218308 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218326 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218340 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218358 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218375 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218394 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218411 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218431 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218449 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218466 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218484 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218500 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218517 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218760 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218791 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218820 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" 
(UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218822 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218849 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218873 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218896 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218923 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218949 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218972 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219000 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219027 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219057 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219085 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219122 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219146 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219173 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219198 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219223 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219246 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219290 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219314 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219339 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219366 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219392 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") 
" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219416 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219440 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219463 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219488 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219514 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219543 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219568 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219596 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219622 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219648 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219668 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 
13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219685 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219702 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219719 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219737 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219754 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219776 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219831 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219864 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219886 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219906 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219923 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 
13:04:01.219940 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219956 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219973 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219989 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220004 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220022 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220038 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220055 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220072 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220089 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220106 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220121 
4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220142 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220161 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220178 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220195 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220214 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220229 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220341 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220359 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220376 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220392 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220408 4867 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220424 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220442 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220458 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220475 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220503 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220521 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220542 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220560 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220578 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220596 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220614 4867 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220632 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220648 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220666 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220685 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220702 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220720 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220736 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220752 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220768 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220785 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220801 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220845 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220862 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220877 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220893 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220908 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220926 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220946 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220962 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220979 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.220995 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221010 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221027 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222031 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222085 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222115 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222149 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222190 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222208 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222232 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222273 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 
13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222300 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222323 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222347 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222375 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222402 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222424 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222515 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222533 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222549 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222563 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222576 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222590 4867 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.218893 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219097 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219045 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219155 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219342 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219630 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.219716 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222984 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226036 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226053 4867 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226067 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226082 4867 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226096 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226108 4867 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: 
I1006 13:04:01.226122 4867 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226135 4867 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226149 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226162 4867 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226175 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226188 4867 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226201 4867 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226214 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" 
(UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226226 4867 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226238 4867 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226267 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226295 4867 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226311 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226324 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226336 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node 
\"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226349 4867 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221222 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221270 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221270 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221245 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221465 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221405 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221690 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221730 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221827 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.221908 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222053 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222439 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222607 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222628 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.222803 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.223329 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.223597 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.223455 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.223860 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.223960 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.223975 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.224191 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.224382 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.224491 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.224501 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.224617 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.224799 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.224816 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.224875 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.224901 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.225031 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.225054 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.225266 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226742 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.225243 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226771 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.225412 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.225569 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.225619 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.225586 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.223169 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.227061 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.227156 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.227227 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.227406 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.227452 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.227525 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.227695 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.227842 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.227866 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.228014 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.228052 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.228177 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:04:01.728155619 +0000 UTC m=+21.186103763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.228439 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.228595 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.228739 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.228972 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.229205 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.226673 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.229340 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.229398 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.229612 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.229624 4867 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.229820 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.229958 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.230017 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.230134 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:01.730121366 +0000 UTC m=+21.188069510 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.230246 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.232244 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.238779 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.238917 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.238937 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.239066 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.239100 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.239293 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.239558 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.239570 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.239634 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.239661 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.239804 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.239812 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.239954 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.240026 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.240151 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.240272 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.240332 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.240349 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.240902 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.241337 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.241699 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.241968 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.253969 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.254053 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:01.754034563 +0000 UTC m=+21.211982707 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.255840 4867 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.255997 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.256337 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.256501 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.256953 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.257052 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.257604 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.258284 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.259501 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.260216 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.261138 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.261764 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.262201 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.263291 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.263419 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.263430 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.263720 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.263819 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.263951 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.264173 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.264207 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.264495 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.264555 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.264592 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.264795 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.264884 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.264895 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.265113 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.265508 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.265538 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.278121 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.278138 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.276661 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.276930 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.278017 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.278301 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.278504 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.280345 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.282528 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.282673 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.282970 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.283244 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.283447 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.283433 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.285450 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.285879 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.286734 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.286953 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.287777 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.288732 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:01.788702518 +0000 UTC m=+21.246650652 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.265505 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.290983 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.292232 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.293498 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.293523 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.294151 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:01.794124908 +0000 UTC m=+21.252073052 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.294905 4867 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.294945 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.294956 4867 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.294966 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.294976 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.294989 4867 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.294998 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295010 4867 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295023 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295039 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295051 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295062 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295075 4867 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc 
kubenswrapper[4867]: I1006 13:04:01.295086 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295098 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295109 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295124 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295143 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295154 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295165 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295179 4867 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.295409 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.296447 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.296642 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.296476 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.296829 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.297238 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.298575 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.298786 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.299195 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.300353 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.302614 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.306614 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.309521 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.310564 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.314793 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.315240 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.317307 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.317809 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.317969 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.318760 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.324630 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.324819 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.338116 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.338508 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.340000 4867 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8" exitCode=255 Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.340063 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8"} Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.355637 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.361131 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.361292 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.363627 4867 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.374309 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.385672 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.386759 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.395702 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.395787 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.395812 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.395950 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396208 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396228 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396241 4867 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396263 4867 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396291 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396300 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396310 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396319 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396328 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396337 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") 
on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396346 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396354 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396362 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396371 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396382 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396391 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396400 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396409 
4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396418 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396429 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396437 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396492 4867 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396524 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396535 4867 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396546 4867 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396557 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396569 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396579 4867 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396588 4867 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396601 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396611 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396623 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396633 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396646 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396656 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396666 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396675 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396684 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396694 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" 
DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396703 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396727 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396737 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396747 4867 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396756 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396765 4867 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396774 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396783 4867 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396794 4867 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396808 4867 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396824 4867 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396838 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396864 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396878 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396895 4867 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396908 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396933 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396946 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396983 4867 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.396996 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397014 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397026 4867 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" 
DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397038 4867 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397052 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397066 4867 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397077 4867 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397090 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397099 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397111 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 
13:04:01.397120 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397132 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397143 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397153 4867 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397165 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397174 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397184 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397194 4867 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397206 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397217 4867 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397228 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397239 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397273 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397289 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397300 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397310 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397320 4867 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397329 4867 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397338 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397348 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397358 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397369 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397379 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397389 4867 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397401 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397414 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397427 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397439 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397451 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397464 4867 
reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397478 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397507 4867 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397522 4867 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397536 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397550 4867 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397563 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397575 4867 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397590 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397602 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397617 4867 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397630 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397644 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397659 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397674 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397688 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397701 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397715 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397729 4867 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397755 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397767 4867 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397777 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node 
\"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397786 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397797 4867 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397806 4867 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397816 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397826 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397837 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397847 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 
13:04:01.397857 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397866 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397875 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397885 4867 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397895 4867 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397904 4867 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397913 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397924 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397936 4867 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397947 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397960 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397972 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397984 4867 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.397996 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.398007 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on 
node \"crc\" DevicePath \"\"" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.406558 4867 scope.go:117] "RemoveContainer" containerID="0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.407763 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc
0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.408447 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.423604 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.435927 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.446646 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.455880 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.467427 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.469831 4867 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.477925 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.478871 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.485215 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.490241 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.493489 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: W1006 13:04:01.493933 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-54331f03846001c0ee32d5ae2bb3292e8c5ec53dad9c03fcffb9ba216eceef39 WatchSource:0}: Error finding container 54331f03846001c0ee32d5ae2bb3292e8c5ec53dad9c03fcffb9ba216eceef39: Status 404 returned error can't find the container with id 54331f03846001c0ee32d5ae2bb3292e8c5ec53dad9c03fcffb9ba216eceef39 Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.511589 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.528342 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 
13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Com
pleted\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.550452 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.570410 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.587177 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.608218 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.629900 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.745768 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-shmxq"] Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.746335 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.746369 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-sdmmb"] Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.747007 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-sdmmb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.751579 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.751746 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zlc7z"] Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.751894 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.752511 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.753579 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rssjd"] Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.754584 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-knnfm"] Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.754849 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.755128 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.755750 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.758013 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.758241 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.758546 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.759379 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.759777 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.759795 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.763542 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.763651 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.764793 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.764819 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.764839 4867 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.764850 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.764883 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.764902 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.765176 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.765300 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.765307 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.765468 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.766639 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.796308 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" 
(2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.800881 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.800957 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.800985 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhm7v\" (UniqueName: \"kubernetes.io/projected/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-kube-api-access-fhm7v\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801000 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-env-overrides\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801015 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-cni-binary-copy\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801029 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801043 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d52bd1ba-10f1-40c3-a0e7-f6e051234752-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801057 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/93569a52-4f36-4017-9834-b3651d6cd63e-ovn-node-metrics-cert\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801072 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-script-lib\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801106 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-897ln\" (UniqueName: \"kubernetes.io/projected/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-kube-api-access-897ln\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801126 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801141 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-var-lib-cni-multus\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801159 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-run-k8s-cni-cncf-io\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801172 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-etc-kubernetes\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801186 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-cni-dir\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801199 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-cnibin\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801214 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-os-release\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801229 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d52bd1ba-10f1-40c3-a0e7-f6e051234752-cni-binary-copy\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801245 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-slash\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801293 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-os-release\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801308 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-run-netns\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801323 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-netns\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801339 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-var-lib-openvswitch\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801353 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-ovn-kubernetes\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801367 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-netd\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801382 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqjws\" (UniqueName: \"kubernetes.io/projected/93569a52-4f36-4017-9834-b3651d6cd63e-kube-api-access-rqjws\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 
13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801400 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-cnibin\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801416 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-var-lib-kubelet\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801431 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801447 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhnn5\" (UniqueName: \"kubernetes.io/projected/d52bd1ba-10f1-40c3-a0e7-f6e051234752-kube-api-access-jhnn5\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801462 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-bin\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801481 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801494 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-var-lib-cni-bin\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801508 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-run-multus-certs\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801524 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-etc-openvswitch\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801540 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-log-socket\") pod \"ovnkube-node-zlc7z\" (UID: 
\"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801560 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-openvswitch\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801579 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-kubelet\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801596 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-rootfs\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801613 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-proxy-tls\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801633 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") 
pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801652 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-daemon-config\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801673 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-config\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801689 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3daf6dcd-ed6a-4a39-892d-2c65de264a48-hosts-file\") pod \"node-resolver-sdmmb\" (UID: \"3daf6dcd-ed6a-4a39-892d-2c65de264a48\") " pod="openshift-dns/node-resolver-sdmmb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801703 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-node-log\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801717 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-conf-dir\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801731 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scfz5\" (UniqueName: \"kubernetes.io/projected/3daf6dcd-ed6a-4a39-892d-2c65de264a48-kube-api-access-scfz5\") pod \"node-resolver-sdmmb\" (UID: \"3daf6dcd-ed6a-4a39-892d-2c65de264a48\") " pod="openshift-dns/node-resolver-sdmmb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801746 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-mcd-auth-proxy-config\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801760 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-system-cni-dir\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801775 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-hostroot\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801789 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-system-cni-dir\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801807 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-systemd-units\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801825 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-socket-dir-parent\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801843 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-systemd\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.801860 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-ovn\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.801942 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:04:02.801927581 +0000 UTC m=+22.259875725 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802022 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802035 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802045 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802081 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:02.802071615 +0000 UTC m=+22.260019749 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802174 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802203 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:02.802188858 +0000 UTC m=+22.260137002 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802345 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802364 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-06 13:04:02.802358862 +0000 UTC m=+22.260307006 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802442 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802452 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802458 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:01 crc kubenswrapper[4867]: E1006 13:04:01.802478 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:02.802471725 +0000 UTC m=+22.260419859 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.814980 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.826871 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.842982 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.860895 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.871386 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.880125 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.890213 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.902934 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3daf6dcd-ed6a-4a39-892d-2c65de264a48-hosts-file\") pod \"node-resolver-sdmmb\" (UID: \"3daf6dcd-ed6a-4a39-892d-2c65de264a48\") " pod="openshift-dns/node-resolver-sdmmb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.902967 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-node-log\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.902985 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-conf-dir\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903002 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scfz5\" (UniqueName: \"kubernetes.io/projected/3daf6dcd-ed6a-4a39-892d-2c65de264a48-kube-api-access-scfz5\") pod \"node-resolver-sdmmb\" (UID: \"3daf6dcd-ed6a-4a39-892d-2c65de264a48\") " pod="openshift-dns/node-resolver-sdmmb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903019 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-mcd-auth-proxy-config\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903035 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-system-cni-dir\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903051 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-systemd-units\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903068 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-system-cni-dir\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903083 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-hostroot\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903097 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-ovn\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903114 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-socket-dir-parent\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903130 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-systemd\") pod \"ovnkube-node-zlc7z\" (UID: 
\"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903143 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3daf6dcd-ed6a-4a39-892d-2c65de264a48-hosts-file\") pod \"node-resolver-sdmmb\" (UID: \"3daf6dcd-ed6a-4a39-892d-2c65de264a48\") " pod="openshift-dns/node-resolver-sdmmb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903194 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-ovn\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903146 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-env-overrides\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903217 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-system-cni-dir\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903220 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-systemd-units\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903263 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhm7v\" (UniqueName: \"kubernetes.io/projected/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-kube-api-access-fhm7v\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903276 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-hostroot\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903283 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/93569a52-4f36-4017-9834-b3651d6cd63e-ovn-node-metrics-cert\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903304 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-script-lib\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903307 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-conf-dir\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903321 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-897ln\" (UniqueName: 
\"kubernetes.io/projected/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-kube-api-access-897ln\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903346 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-cni-binary-copy\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903362 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903376 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d52bd1ba-10f1-40c3-a0e7-f6e051234752-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903404 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-var-lib-cni-multus\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903429 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" 
(UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-run-k8s-cni-cncf-io\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903446 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-etc-kubernetes\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903464 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-cnibin\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903481 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-os-release\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903501 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d52bd1ba-10f1-40c3-a0e7-f6e051234752-cni-binary-copy\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903518 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-slash\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903535 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-cni-dir\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903550 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-var-lib-openvswitch\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903568 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-ovn-kubernetes\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903584 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-netd\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903599 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqjws\" (UniqueName: 
\"kubernetes.io/projected/93569a52-4f36-4017-9834-b3651d6cd63e-kube-api-access-rqjws\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903614 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-os-release\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903628 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-run-netns\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903634 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-env-overrides\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903641 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-netns\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903085 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-node-log\") pod \"ovnkube-node-zlc7z\" (UID: 
\"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903696 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-cnibin\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903736 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-system-cni-dir\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903751 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-socket-dir-parent\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903658 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-cnibin\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903774 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-systemd\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903787 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-var-lib-kubelet\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903805 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903823 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhnn5\" (UniqueName: \"kubernetes.io/projected/d52bd1ba-10f1-40c3-a0e7-f6e051234752-kube-api-access-jhnn5\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903835 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-os-release\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903840 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-bin\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 
13:04:01.903856 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-log-socket\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903882 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-var-lib-cni-bin\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903905 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-run-multus-certs\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903930 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-etc-openvswitch\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903951 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-mcd-auth-proxy-config\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903956 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-openvswitch\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903976 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-kubelet\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.903992 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-rootfs\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904008 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-proxy-tls\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904030 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-daemon-config\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904044 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-config\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904460 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d52bd1ba-10f1-40c3-a0e7-f6e051234752-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904508 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-script-lib\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904561 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-config\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904567 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-var-lib-cni-multus\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904597 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-run-k8s-cni-cncf-io\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904602 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-var-lib-kubelet\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904623 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-etc-kubernetes\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904630 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904652 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-cnibin\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904683 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-var-lib-cni-bin\") 
pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904721 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-run-multus-certs\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904770 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-rootfs\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904814 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-kubelet\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904876 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-host-run-netns\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.904889 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-bin\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc 
kubenswrapper[4867]: I1006 13:04:01.905036 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-netns\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905037 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-log-socket\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905098 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-cni-dir\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905151 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-slash\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905153 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-cni-binary-copy\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905192 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-os-release\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905193 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-netd\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905225 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-etc-openvswitch\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905227 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d52bd1ba-10f1-40c3-a0e7-f6e051234752-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905261 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-openvswitch\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905285 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905317 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-var-lib-openvswitch\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905504 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d52bd1ba-10f1-40c3-a0e7-f6e051234752-cni-binary-copy\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.905656 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-multus-daemon-config\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.907525 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/93569a52-4f36-4017-9834-b3651d6cd63e-ovn-node-metrics-cert\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.907839 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.908107 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-proxy-tls\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.920432 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-897ln\" (UniqueName: \"kubernetes.io/projected/9f5dc284-392f-4e65-9f43-cb9ced2e47d3-kube-api-access-897ln\") pod \"machine-config-daemon-shmxq\" (UID: \"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\") " pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.920969 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.921555 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqjws\" (UniqueName: \"kubernetes.io/projected/93569a52-4f36-4017-9834-b3651d6cd63e-kube-api-access-rqjws\") pod \"ovnkube-node-zlc7z\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.922116 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhm7v\" (UniqueName: \"kubernetes.io/projected/8e3bebeb-f8c1-4b1e-a320-b937eced1c3a-kube-api-access-fhm7v\") pod \"multus-knnfm\" (UID: \"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\") " pod="openshift-multus/multus-knnfm" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.927150 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scfz5\" (UniqueName: \"kubernetes.io/projected/3daf6dcd-ed6a-4a39-892d-2c65de264a48-kube-api-access-scfz5\") pod \"node-resolver-sdmmb\" (UID: \"3daf6dcd-ed6a-4a39-892d-2c65de264a48\") " pod="openshift-dns/node-resolver-sdmmb" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.929701 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhnn5\" (UniqueName: \"kubernetes.io/projected/d52bd1ba-10f1-40c3-a0e7-f6e051234752-kube-api-access-jhnn5\") pod \"multus-additional-cni-plugins-rssjd\" (UID: \"d52bd1ba-10f1-40c3-a0e7-f6e051234752\") " pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:01 crc 
kubenswrapper[4867]: I1006 13:04:01.933876 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.943889 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.954507 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.971066 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:01 crc kubenswrapper[4867]: I1006 13:04:01.986425 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.001683 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.015431 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.031261 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.047296 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.057942 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.058348 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.068186 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-sdmmb" Oct 06 13:04:02 crc kubenswrapper[4867]: W1006 13:04:02.068266 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f5dc284_392f_4e65_9f43_cb9ced2e47d3.slice/crio-ec296491c19f35ff8afab5ae213dc5b36efc0b80674fb7f900fe1a2ada0b8462 WatchSource:0}: Error finding container ec296491c19f35ff8afab5ae213dc5b36efc0b80674fb7f900fe1a2ada0b8462: Status 404 returned error can't find the container with id ec296491c19f35ff8afab5ae213dc5b36efc0b80674fb7f900fe1a2ada0b8462 Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.070623 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.081876 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.081945 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.094109 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-knnfm" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.103934 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rssjd" Oct 06 13:04:02 crc kubenswrapper[4867]: W1006 13:04:02.116806 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e3bebeb_f8c1_4b1e_a320_b937eced1c3a.slice/crio-6b675cf5000638bf6ab88bab91f1dcf885ef9d047ea56de3df0a32c0dd5824f4 WatchSource:0}: Error finding container 6b675cf5000638bf6ab88bab91f1dcf885ef9d047ea56de3df0a32c0dd5824f4: Status 404 returned error can't find the container with id 6b675cf5000638bf6ab88bab91f1dcf885ef9d047ea56de3df0a32c0dd5824f4 Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.288013 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.303702 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.322222 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.338765 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.345716 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.353053 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.353360 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.356172 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ce8aa87b981a783c23faeccf52e608f84fce51c0c28076e11377be341d0ecb9c"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.360347 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.362004 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.362072 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.362093 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"54331f03846001c0ee32d5ae2bb3292e8c5ec53dad9c03fcffb9ba216eceef39"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.363942 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-knnfm" event={"ID":"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a","Type":"ContainerStarted","Data":"2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.363990 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-knnfm" event={"ID":"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a","Type":"ContainerStarted","Data":"6b675cf5000638bf6ab88bab91f1dcf885ef9d047ea56de3df0a32c0dd5824f4"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.366435 4867 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.366470 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"ec296491c19f35ff8afab5ae213dc5b36efc0b80674fb7f900fe1a2ada0b8462"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.367839 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.367886 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"692cb34301014bfb9d57c00a27f38026b5bc78771fb6394241acc65233e5c2ad"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.369378 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" event={"ID":"d52bd1ba-10f1-40c3-a0e7-f6e051234752","Type":"ContainerStarted","Data":"42c19ae8f400a87e8b073389a4b3d616a86e913ecd52d79968ed741703eeaa2d"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.371029 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38" exitCode=0 Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.371078 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" 
event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.371098 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"e98a11e9ef63972ce1b8ba9d81d42ae322f95e27805595687b5d3c76065279c8"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.375836 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sdmmb" event={"ID":"3daf6dcd-ed6a-4a39-892d-2c65de264a48","Type":"ContainerStarted","Data":"70f5aa1c562a15ca96f6fa9d1e30e2efcc0ff7d6a4103841a36c53b45e13bddd"} Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.394737 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa37232
69019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-
o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.428156 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.452300 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.470722 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.488060 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.499881 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.515031 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.535145 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync 
for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.552046 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.570670 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.599103 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.620538 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.645454 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.662158 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.675093 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.708073 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.751224 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.795541 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.812114 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.812210 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.812240 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812308 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:04:04.812279003 +0000 UTC m=+24.270227147 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.812358 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.812410 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812429 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812449 4867 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812466 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812521 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:04.812503348 +0000 UTC m=+24.270451492 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812561 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812570 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812596 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:04.81258872 +0000 UTC m=+24.270536874 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812647 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:04.812639821 +0000 UTC m=+24.270587965 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812656 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812670 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812681 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:02 crc kubenswrapper[4867]: E1006 13:04:02.812713 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:04.812706123 +0000 UTC m=+24.270654267 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.827998 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.874168 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.906873 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:02 crc kubenswrapper[4867]: I1006 13:04:02.947574 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.221285 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.221345 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.221320 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:03 crc kubenswrapper[4867]: E1006 13:04:03.221518 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:03 crc kubenswrapper[4867]: E1006 13:04:03.221654 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:03 crc kubenswrapper[4867]: E1006 13:04:03.221786 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.225490 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.226378 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.227130 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.227827 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.228455 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.229037 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.229630 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.230239 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.230816 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.231344 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.231988 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.232529 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.233057 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.233618 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.234056 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.234743 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.235274 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.238604 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.239068 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.239715 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.240575 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.241141 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.242159 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.242669 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.243847 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.244370 4867 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.244468 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.246562 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.247291 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.247665 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.249135 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.250095 4867 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.250656 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.251726 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.252753 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.253741 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.254748 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.255417 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.256197 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.256696 4867 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.257139 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.380995 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sdmmb" event={"ID":"3daf6dcd-ed6a-4a39-892d-2c65de264a48","Type":"ContainerStarted","Data":"407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f"} Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.382796 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1"} Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.384350 4867 generic.go:334] "Generic (PLEG): container finished" podID="d52bd1ba-10f1-40c3-a0e7-f6e051234752" containerID="52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12" exitCode=0 Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.384425 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" event={"ID":"d52bd1ba-10f1-40c3-a0e7-f6e051234752","Type":"ContainerDied","Data":"52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12"} Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.391713 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d"} Oct 06 13:04:03 
crc kubenswrapper[4867]: I1006 13:04:03.391768 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3"} Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.391787 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7"} Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.391801 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7"} Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.391815 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8"} Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.391830 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315"} Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.402783 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.421474 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.434110 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.450108 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.475231 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.498934 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.515132 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.527129 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.539286 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.554161 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.583725 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.607348 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.620956 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.644137 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-x2x4x"] Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.644480 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-x2x4x" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.649754 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.650702 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.651020 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.651179 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.661095 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.674920 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.688473 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.720247 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.721569 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fce7eafd-a44a-4e15-b02e-30800f29c4e7-serviceca\") pod \"node-ca-x2x4x\" (UID: \"fce7eafd-a44a-4e15-b02e-30800f29c4e7\") " pod="openshift-image-registry/node-ca-x2x4x" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.721632 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fce7eafd-a44a-4e15-b02e-30800f29c4e7-host\") pod \"node-ca-x2x4x\" (UID: \"fce7eafd-a44a-4e15-b02e-30800f29c4e7\") " pod="openshift-image-registry/node-ca-x2x4x" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.721655 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n27vn\" (UniqueName: \"kubernetes.io/projected/fce7eafd-a44a-4e15-b02e-30800f29c4e7-kube-api-access-n27vn\") pod \"node-ca-x2x4x\" (UID: \"fce7eafd-a44a-4e15-b02e-30800f29c4e7\") " pod="openshift-image-registry/node-ca-x2x4x" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.746641 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.783687 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.786825 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.787060 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.805615 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.822442 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n27vn\" (UniqueName: \"kubernetes.io/projected/fce7eafd-a44a-4e15-b02e-30800f29c4e7-kube-api-access-n27vn\") pod \"node-ca-x2x4x\" (UID: \"fce7eafd-a44a-4e15-b02e-30800f29c4e7\") " pod="openshift-image-registry/node-ca-x2x4x" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.822515 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fce7eafd-a44a-4e15-b02e-30800f29c4e7-serviceca\") pod \"node-ca-x2x4x\" (UID: \"fce7eafd-a44a-4e15-b02e-30800f29c4e7\") " pod="openshift-image-registry/node-ca-x2x4x" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.822555 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fce7eafd-a44a-4e15-b02e-30800f29c4e7-host\") pod \"node-ca-x2x4x\" (UID: \"fce7eafd-a44a-4e15-b02e-30800f29c4e7\") " pod="openshift-image-registry/node-ca-x2x4x" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.822623 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fce7eafd-a44a-4e15-b02e-30800f29c4e7-host\") pod \"node-ca-x2x4x\" (UID: \"fce7eafd-a44a-4e15-b02e-30800f29c4e7\") " pod="openshift-image-registry/node-ca-x2x4x" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.823776 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fce7eafd-a44a-4e15-b02e-30800f29c4e7-serviceca\") pod \"node-ca-x2x4x\" (UID: 
\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\") " pod="openshift-image-registry/node-ca-x2x4x" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.864188 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.881241 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n27vn\" (UniqueName: \"kubernetes.io/projected/fce7eafd-a44a-4e15-b02e-30800f29c4e7-kube-api-access-n27vn\") pod \"node-ca-x2x4x\" (UID: \"fce7eafd-a44a-4e15-b02e-30800f29c4e7\") " pod="openshift-image-registry/node-ca-x2x4x" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.910187 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.945813 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] 
Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:03 crc kubenswrapper[4867]: I1006 13:04:03.986141 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.005972 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x2x4x" Oct 06 13:04:04 crc kubenswrapper[4867]: W1006 13:04:04.024507 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfce7eafd_a44a_4e15_b02e_30800f29c4e7.slice/crio-717c73b5bcbfa2bfd574335b7d6fc0660b07f79bba1f7d3b67f773f76caaeee7 WatchSource:0}: Error finding container 717c73b5bcbfa2bfd574335b7d6fc0660b07f79bba1f7d3b67f773f76caaeee7: Status 404 returned error can't find the container with id 717c73b5bcbfa2bfd574335b7d6fc0660b07f79bba1f7d3b67f773f76caaeee7 Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.031482 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.067917 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.106739 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.152228 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.187201 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 
13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.231703 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.264447 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.303603 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.346934 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.387101 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.396136 4867 generic.go:334] "Generic (PLEG): container finished" podID="d52bd1ba-10f1-40c3-a0e7-f6e051234752" containerID="75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67" exitCode=0 Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.396200 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" 
event={"ID":"d52bd1ba-10f1-40c3-a0e7-f6e051234752","Type":"ContainerDied","Data":"75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67"} Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.397276 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x2x4x" event={"ID":"fce7eafd-a44a-4e15-b02e-30800f29c4e7","Type":"ContainerStarted","Data":"ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab"} Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.397318 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x2x4x" event={"ID":"fce7eafd-a44a-4e15-b02e-30800f29c4e7","Type":"ContainerStarted","Data":"717c73b5bcbfa2bfd574335b7d6fc0660b07f79bba1f7d3b67f773f76caaeee7"} Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.398718 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2"} Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.427091 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.472676 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.503782 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.555538 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.586052 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:
42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.628036 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.670745 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.708688 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.745569 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.786749 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.824298 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.831804 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.831885 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:04:08.831867438 +0000 UTC m=+28.289815582 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.831913 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.831947 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.831971 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.831996 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.832073 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.832102 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:08.832095584 +0000 UTC m=+28.290043718 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.832284 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.832303 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.832313 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.832345 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:08.83233634 +0000 UTC m=+28.290284474 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.832346 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.832376 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.832402 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.832417 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:04 crc 
kubenswrapper[4867]: E1006 13:04:04.832421 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:08.832406021 +0000 UTC m=+28.290354165 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:04 crc kubenswrapper[4867]: E1006 13:04:04.832455 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:08.832442222 +0000 UTC m=+28.290390476 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.865929 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.902516 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.948580 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:04 crc kubenswrapper[4867]: I1006 13:04:04.984829 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.025898 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.065280 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.113755 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 
13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedA
t\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.145618 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.187067 4867 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.221201 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.221280 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:05 crc kubenswrapper[4867]: E1006 13:04:05.221365 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.221209 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:05 crc kubenswrapper[4867]: E1006 13:04:05.221500 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:05 crc kubenswrapper[4867]: E1006 13:04:05.221583 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.225892 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.265479 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.306662 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.349796 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.403228 4867 generic.go:334] "Generic (PLEG): container finished" podID="d52bd1ba-10f1-40c3-a0e7-f6e051234752" containerID="ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0" exitCode=0 Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.403283 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" event={"ID":"d52bd1ba-10f1-40c3-a0e7-f6e051234752","Type":"ContainerDied","Data":"ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0"} Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.407647 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9"} Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.429575 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.460828 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.487374 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.504459 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.542981 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.587095 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.628882 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.665155 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.708711 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 
13:04:05.745009 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.787990 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.824932 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.868610 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.908823 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:05 crc kubenswrapper[4867]: I1006 13:04:05.945626 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:05Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.413439 4867 generic.go:334] "Generic (PLEG): container finished" podID="d52bd1ba-10f1-40c3-a0e7-f6e051234752" 
containerID="05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34" exitCode=0 Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.413485 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" event={"ID":"d52bd1ba-10f1-40c3-a0e7-f6e051234752","Type":"ContainerDied","Data":"05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34"} Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.428927 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.448690 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.463399 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.478452 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.490671 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.505637 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.523604 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.533275 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.544583 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.561855 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.572121 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.591901 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.603848 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.613805 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.625061 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.809550 4867 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.811777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.811830 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.811842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.811982 4867 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.817941 4867 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.818189 4867 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.819195 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.819221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.819231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.819244 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.819269 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:06Z","lastTransitionTime":"2025-10-06T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:06 crc kubenswrapper[4867]: E1006 13:04:06.832362 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.836413 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.836467 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.836479 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.836499 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.836511 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:06Z","lastTransitionTime":"2025-10-06T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:06 crc kubenswrapper[4867]: E1006 13:04:06.847302 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.851003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.851032 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.851043 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.851060 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.851070 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:06Z","lastTransitionTime":"2025-10-06T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:06 crc kubenswrapper[4867]: E1006 13:04:06.861966 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.865756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.865801 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.865816 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.865833 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.865843 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:06Z","lastTransitionTime":"2025-10-06T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:06 crc kubenswrapper[4867]: E1006 13:04:06.876827 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.880572 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.880613 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.880623 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.880642 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.880653 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:06Z","lastTransitionTime":"2025-10-06T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:06 crc kubenswrapper[4867]: E1006 13:04:06.891965 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:06Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:06 crc kubenswrapper[4867]: E1006 13:04:06.892078 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.893690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.893742 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.893753 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.893767 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.893776 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:06Z","lastTransitionTime":"2025-10-06T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.995867 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.995908 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.995916 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.995930 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:06 crc kubenswrapper[4867]: I1006 13:04:06.995979 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:06Z","lastTransitionTime":"2025-10-06T13:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.098071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.098111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.098122 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.098138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.098149 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:07Z","lastTransitionTime":"2025-10-06T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.200757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.200791 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.200800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.200813 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.200822 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:07Z","lastTransitionTime":"2025-10-06T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.221104 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.221130 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:07 crc kubenswrapper[4867]: E1006 13:04:07.221600 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.221279 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:07 crc kubenswrapper[4867]: E1006 13:04:07.221692 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:07 crc kubenswrapper[4867]: E1006 13:04:07.221859 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.304707 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.304752 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.304763 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.304782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.304793 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:07Z","lastTransitionTime":"2025-10-06T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.407221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.407268 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.407279 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.407293 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.407306 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:07Z","lastTransitionTime":"2025-10-06T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.418503 4867 generic.go:334] "Generic (PLEG): container finished" podID="d52bd1ba-10f1-40c3-a0e7-f6e051234752" containerID="d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3" exitCode=0 Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.418530 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" event={"ID":"d52bd1ba-10f1-40c3-a0e7-f6e051234752","Type":"ContainerDied","Data":"d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3"} Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.422788 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175"} Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.435236 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.449494 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.463870 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.475482 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.490930 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.508442 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.510846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.510889 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.510905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.510923 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.510936 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:07Z","lastTransitionTime":"2025-10-06T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.547832 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.576208 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.592093 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.601454 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.612447 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.613171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.613217 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.613230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.613264 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.613278 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:07Z","lastTransitionTime":"2025-10-06T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.629317 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.639490 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.656519 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.668218 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.679283 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.691434 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.704305 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.715963 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.715994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.716004 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.716019 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.716030 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:07Z","lastTransitionTime":"2025-10-06T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.717235 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.728423 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.744268 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d
7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.756346 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.776187 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.787685 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.799375 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.811086 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.818490 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.818518 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.818526 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.818540 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.818550 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:07Z","lastTransitionTime":"2025-10-06T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.822633 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z 
is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.833372 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.844216 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.860059 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:07Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.921126 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.921172 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.921182 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.921198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:07 crc kubenswrapper[4867]: I1006 13:04:07.921209 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:07Z","lastTransitionTime":"2025-10-06T13:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.023795 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.023841 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.023849 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.023864 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.023874 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:08Z","lastTransitionTime":"2025-10-06T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.125856 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.125896 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.125905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.125921 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.125931 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:08Z","lastTransitionTime":"2025-10-06T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.227993 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.228059 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.228078 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.228120 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.228140 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:08Z","lastTransitionTime":"2025-10-06T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.330770 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.330804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.330814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.330825 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.330834 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:08Z","lastTransitionTime":"2025-10-06T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.429408 4867 generic.go:334] "Generic (PLEG): container finished" podID="d52bd1ba-10f1-40c3-a0e7-f6e051234752" containerID="ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b" exitCode=0 Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.429468 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" event={"ID":"d52bd1ba-10f1-40c3-a0e7-f6e051234752","Type":"ContainerDied","Data":"ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.429513 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.430227 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.430281 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.433292 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.433379 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.433396 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.433418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.433486 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:08Z","lastTransitionTime":"2025-10-06T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.443679 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.463084 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.468058 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.470663 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.490543 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.502848 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.515580 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.528905 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.536239 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.536308 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.536323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.536344 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.536358 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:08Z","lastTransitionTime":"2025-10-06T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.539956 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.550708 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.564825 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.574882 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.588584 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.602234 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.616095 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.630376 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.639394 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.639437 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.639446 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.639463 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.639472 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:08Z","lastTransitionTime":"2025-10-06T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.644365 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.656521 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.666849 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.677610 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.688545 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.698949 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.709069 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.728720 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.741814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.741863 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.741876 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.741898 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.741913 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:08Z","lastTransitionTime":"2025-10-06T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.742911 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.756894 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.785981 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.835781 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.844172 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.844222 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.844236 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.844269 4867 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.844278 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:08Z","lastTransitionTime":"2025-10-06T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.865708 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.872013 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.872140 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.872183 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:04:16.872163265 +0000 UTC m=+36.330111409 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.872215 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.872275 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.872295 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.872306 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 
13:04:08.872331 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.872340 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:16.872328089 +0000 UTC m=+36.330276233 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.872355 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.872365 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:16.872358089 +0000 UTC m=+36.330306333 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.872280 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.872401 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:16.87239224 +0000 UTC m=+36.330340384 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.872606 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.872956 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.873016 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.873046 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:08 crc kubenswrapper[4867]: E1006 13:04:08.873176 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-06 13:04:16.873133178 +0000 UTC m=+36.331081442 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.903927 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.945932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.945965 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.945974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.945987 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.945996 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:08Z","lastTransitionTime":"2025-10-06T13:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.948001 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:08 crc kubenswrapper[4867]: I1006 13:04:08.984907 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.048339 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.048393 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.048406 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:09 crc 
kubenswrapper[4867]: I1006 13:04:09.048424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.048439 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:09Z","lastTransitionTime":"2025-10-06T13:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.150940 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.151001 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.151018 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.151040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.151060 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:09Z","lastTransitionTime":"2025-10-06T13:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.220964 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.221064 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.221131 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:09 crc kubenswrapper[4867]: E1006 13:04:09.221112 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:09 crc kubenswrapper[4867]: E1006 13:04:09.221143 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:09 crc kubenswrapper[4867]: E1006 13:04:09.221215 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.253710 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.253745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.253754 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.253769 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.253782 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:09Z","lastTransitionTime":"2025-10-06T13:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.355948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.355983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.355994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.356010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.356022 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:09Z","lastTransitionTime":"2025-10-06T13:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.435380 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.435898 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" event={"ID":"d52bd1ba-10f1-40c3-a0e7-f6e051234752","Type":"ContainerStarted","Data":"e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05"} Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.451219 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.458135 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.458167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.458177 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.458194 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.458206 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:09Z","lastTransitionTime":"2025-10-06T13:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.463082 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1
e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.480396 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.497966 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.512056 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.523986 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.539902 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.550600 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.560160 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.560214 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.560224 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.560239 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.560262 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:09Z","lastTransitionTime":"2025-10-06T13:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.564781 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.574647 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.597854 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.612087 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.624930 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.642928 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.659810 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:09Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.662298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.662329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.662339 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:09 crc 
kubenswrapper[4867]: I1006 13:04:09.662352 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.662361 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:09Z","lastTransitionTime":"2025-10-06T13:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.764733 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.764776 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.764785 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.764800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.764808 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:09Z","lastTransitionTime":"2025-10-06T13:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.866654 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.866689 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.866729 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.866755 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.866784 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:09Z","lastTransitionTime":"2025-10-06T13:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.969190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.969233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.969262 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.969280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:09 crc kubenswrapper[4867]: I1006 13:04:09.969311 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:09Z","lastTransitionTime":"2025-10-06T13:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.072817 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.073045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.073053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.073066 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.073074 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:10Z","lastTransitionTime":"2025-10-06T13:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.175796 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.175831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.175840 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.175852 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.175862 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:10Z","lastTransitionTime":"2025-10-06T13:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.277832 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.277874 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.277883 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.277898 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.277916 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:10Z","lastTransitionTime":"2025-10-06T13:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.380209 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.380276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.380290 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.380317 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.380329 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:10Z","lastTransitionTime":"2025-10-06T13:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.439562 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/0.log" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.442085 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175" exitCode=1 Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.442156 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175"} Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.443545 4867 scope.go:117] "RemoveContainer" containerID="7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.455910 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 
13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedA
t\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.469735 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.481838 4867 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.482122 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.482153 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.482166 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.482180 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.482192 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:10Z","lastTransitionTime":"2025-10-06T13:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.499007 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.512139 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.524448 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.543535 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:10Z\\\",\\\"message\\\":\\\"ler/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 13:04:10.115040 6123 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 13:04:10.115058 6123 reflector.go:311] Stopping reflector 
*v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 13:04:10.115115 6123 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 13:04:10.115124 6123 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 13:04:10.115143 6123 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 13:04:10.115160 6123 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 13:04:10.115177 6123 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 13:04:10.115194 6123 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.554482 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.571653 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.584809 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.584845 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.584856 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.584870 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.584881 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:10Z","lastTransitionTime":"2025-10-06T13:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.589834 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.615033 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.625573 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.637226 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.646588 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.656660 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:10Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.687321 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.687361 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.687369 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:10 crc 
kubenswrapper[4867]: I1006 13:04:10.687385 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.687397 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:10Z","lastTransitionTime":"2025-10-06T13:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.789643 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.789682 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.789691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.789704 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.789714 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:10Z","lastTransitionTime":"2025-10-06T13:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.891335 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.891366 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.891375 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.891387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.891396 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:10Z","lastTransitionTime":"2025-10-06T13:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.993457 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.993503 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.993514 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.993530 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:10 crc kubenswrapper[4867]: I1006 13:04:10.993542 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:10Z","lastTransitionTime":"2025-10-06T13:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.098108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.098161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.098173 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.098190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.098202 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:11Z","lastTransitionTime":"2025-10-06T13:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.200732 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.200761 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.200769 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.200781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.200790 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:11Z","lastTransitionTime":"2025-10-06T13:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.220219 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.220325 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:11 crc kubenswrapper[4867]: E1006 13:04:11.220358 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.220436 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:11 crc kubenswrapper[4867]: E1006 13:04:11.220500 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:11 crc kubenswrapper[4867]: E1006 13:04:11.220596 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.231855 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.247752 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:10Z\\\",\\\"message\\\":\\\"ler/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 13:04:10.115040 6123 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 13:04:10.115058 6123 reflector.go:311] Stopping reflector 
*v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 13:04:10.115115 6123 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 13:04:10.115124 6123 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 13:04:10.115143 6123 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 13:04:10.115160 6123 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 13:04:10.115177 6123 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 13:04:10.115194 6123 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.262686 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753
a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.271553 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.287964 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.298145 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.303234 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.303292 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.303307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 
13:04:11.303323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.303334 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:11Z","lastTransitionTime":"2025-10-06T13:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.307680 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.318306 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.328944 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.337872 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.349866 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.360579 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.375107 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.386106 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.396799 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.405494 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:11 crc 
kubenswrapper[4867]: I1006 13:04:11.405522 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.405531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.405545 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.405556 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:11Z","lastTransitionTime":"2025-10-06T13:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.446140 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/1.log" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.446703 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/0.log" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.449121 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d" exitCode=1 Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.449154 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" 
event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d"} Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.449197 4867 scope.go:117] "RemoveContainer" containerID="7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.449675 4867 scope.go:117] "RemoveContainer" containerID="327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d" Oct 06 13:04:11 crc kubenswrapper[4867]: E1006 13:04:11.449800 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.458546 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.475860 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.489335 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.499322 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.507500 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.507644 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.507734 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.507805 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.507868 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:11Z","lastTransitionTime":"2025-10-06T13:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.510297 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.519991 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.529629 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.542061 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.553556 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.563491 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.574618 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.585710 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.595663 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.605959 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.610117 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.610146 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.610155 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.610169 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.610177 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:11Z","lastTransitionTime":"2025-10-06T13:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.621997 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c9f68372cbb78ff8bf0ffa37b78d74dc85b2b92dbb6f126658c4e5147c9d175\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:10Z\\\",\\\"message\\\":\\\"ler/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 13:04:10.115040 6123 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 13:04:10.115058 6123 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 13:04:10.115115 6123 reflector.go:311] Stopping 
reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 13:04:10.115124 6123 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1006 13:04:10.115143 6123 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 13:04:10.115160 6123 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1006 13:04:10.115177 6123 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1006 13:04:10.115194 6123 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:11Z\\\",\\\"message\\\":\\\"s-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230527 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230531 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-rssjd in node crc\\\\nI1006 13:04:11.230535 6287 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd after 0 failed attempt(s)\\\\nI1006 13:04:11.230539 6287 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230546 6287 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230552 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230562 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-knnfm in node crc\\\\nI1006 13:04:11.230567 6287 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-knnfm after 0 failed attempt(s)\\\\nF1006 13:04:11.230569 6287 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/c
ni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.712338 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:11 crc 
kubenswrapper[4867]: I1006 13:04:11.712396 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.712405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.712422 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.712434 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:11Z","lastTransitionTime":"2025-10-06T13:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.814777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.814813 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.814821 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.814834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.814842 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:11Z","lastTransitionTime":"2025-10-06T13:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.917521 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.917564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.917573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.917591 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:11 crc kubenswrapper[4867]: I1006 13:04:11.917601 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:11Z","lastTransitionTime":"2025-10-06T13:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.020006 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.020039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.020048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.020060 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.020166 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:12Z","lastTransitionTime":"2025-10-06T13:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.122578 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.122634 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.122647 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.122661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.122671 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:12Z","lastTransitionTime":"2025-10-06T13:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.224680 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.224730 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.224742 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.224763 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.224775 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:12Z","lastTransitionTime":"2025-10-06T13:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.327170 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.327207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.327218 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.327233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.327243 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:12Z","lastTransitionTime":"2025-10-06T13:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.429724 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.429801 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.429826 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.429863 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.429886 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:12Z","lastTransitionTime":"2025-10-06T13:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.452891 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/1.log" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.457109 4867 scope.go:117] "RemoveContainer" containerID="327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d" Oct 06 13:04:12 crc kubenswrapper[4867]: E1006 13:04:12.457318 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.466887 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.490564 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.506868 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.519502 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.532343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.532372 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.532381 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.532394 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.532403 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:12Z","lastTransitionTime":"2025-10-06T13:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.532424 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.544064 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.554773 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.568904 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.584311 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.597305 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.614193 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.628879 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.634144 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.634187 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.634203 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.634224 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.634238 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:12Z","lastTransitionTime":"2025-10-06T13:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.644048 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.659661 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.686227 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:11Z\\\",\\\"message\\\":\\\"s-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230527 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230531 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-rssjd in node crc\\\\nI1006 13:04:11.230535 6287 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd after 0 failed attempt(s)\\\\nI1006 13:04:11.230539 6287 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230546 6287 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230552 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230562 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-knnfm in node crc\\\\nI1006 13:04:11.230567 6287 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-knnfm after 0 failed attempt(s)\\\\nF1006 13:04:11.230569 6287 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:12Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.737015 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.737053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.737067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.737084 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.737096 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:12Z","lastTransitionTime":"2025-10-06T13:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.839578 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.839648 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.839664 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.840553 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.840618 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:12Z","lastTransitionTime":"2025-10-06T13:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.943022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.943055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.943066 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.943085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:12 crc kubenswrapper[4867]: I1006 13:04:12.943097 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:12Z","lastTransitionTime":"2025-10-06T13:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.046462 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.046499 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.046511 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.046527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.046541 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:13Z","lastTransitionTime":"2025-10-06T13:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.148425 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.148468 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.148478 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.148493 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.148502 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:13Z","lastTransitionTime":"2025-10-06T13:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.220682 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:13 crc kubenswrapper[4867]: E1006 13:04:13.220794 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.220694 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.220880 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:13 crc kubenswrapper[4867]: E1006 13:04:13.220967 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:13 crc kubenswrapper[4867]: E1006 13:04:13.221124 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.251497 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.251529 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.251538 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.251555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.251564 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:13Z","lastTransitionTime":"2025-10-06T13:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.354671 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.354714 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.354728 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.354745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.354756 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:13Z","lastTransitionTime":"2025-10-06T13:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.458591 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.458642 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.458655 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.458672 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.458684 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:13Z","lastTransitionTime":"2025-10-06T13:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.562688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.562717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.562725 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.562737 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.562746 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:13Z","lastTransitionTime":"2025-10-06T13:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.665111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.665162 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.665171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.665185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.665195 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:13Z","lastTransitionTime":"2025-10-06T13:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.767167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.767205 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.767214 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.767232 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.767241 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:13Z","lastTransitionTime":"2025-10-06T13:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.870133 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.870190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.870205 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.870221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.870234 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:13Z","lastTransitionTime":"2025-10-06T13:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.973396 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.973497 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.973519 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.973543 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:13 crc kubenswrapper[4867]: I1006 13:04:13.973598 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:13Z","lastTransitionTime":"2025-10-06T13:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.076893 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.076955 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.076969 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.076989 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.077006 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:14Z","lastTransitionTime":"2025-10-06T13:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.180675 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.180758 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.180779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.180813 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.180835 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:14Z","lastTransitionTime":"2025-10-06T13:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.285125 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.285172 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.285182 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.285199 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.285212 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:14Z","lastTransitionTime":"2025-10-06T13:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.331863 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd"] Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.332623 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.335419 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.335831 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.354697 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:11Z\\\",\\\"message\\\":\\\"s-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230527 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230531 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-rssjd in node crc\\\\nI1006 13:04:11.230535 6287 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd after 0 failed attempt(s)\\\\nI1006 13:04:11.230539 6287 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230546 6287 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230552 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230562 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-knnfm in node crc\\\\nI1006 13:04:11.230567 6287 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-knnfm after 0 failed attempt(s)\\\\nF1006 13:04:11.230569 6287 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.371913 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.387213 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.388161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.388203 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.388214 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.388231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.388244 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:14Z","lastTransitionTime":"2025-10-06T13:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.402589 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.415311 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.426143 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.426301 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c693b796-691d-4cc2-8d01-a0589e8833ef-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.426368 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c693b796-691d-4cc2-8d01-a0589e8833ef-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.426436 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwpkb\" (UniqueName: \"kubernetes.io/projected/c693b796-691d-4cc2-8d01-a0589e8833ef-kube-api-access-fwpkb\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.426489 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c693b796-691d-4cc2-8d01-a0589e8833ef-env-overrides\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.447238 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entry
point\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\
\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.460048 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6
bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.483174 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.490156 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.490207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.490217 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.490233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.490245 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:14Z","lastTransitionTime":"2025-10-06T13:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.498216 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.512117 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.527709 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c693b796-691d-4cc2-8d01-a0589e8833ef-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.527854 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c693b796-691d-4cc2-8d01-a0589e8833ef-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.527927 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwpkb\" (UniqueName: \"kubernetes.io/projected/c693b796-691d-4cc2-8d01-a0589e8833ef-kube-api-access-fwpkb\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.528018 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c693b796-691d-4cc2-8d01-a0589e8833ef-env-overrides\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.529523 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c693b796-691d-4cc2-8d01-a0589e8833ef-env-overrides\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.529738 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c693b796-691d-4cc2-8d01-a0589e8833ef-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.530429 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.538026 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c693b796-691d-4cc2-8d01-a0589e8833ef-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.546646 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwpkb\" (UniqueName: \"kubernetes.io/projected/c693b796-691d-4cc2-8d01-a0589e8833ef-kube-api-access-fwpkb\") pod \"ovnkube-control-plane-749d76644c-f4ldd\" (UID: \"c693b796-691d-4cc2-8d01-a0589e8833ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.548554 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 
13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedA
t\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.564003 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.577654 4867 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.593117 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:14Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.593610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:14 crc 
kubenswrapper[4867]: I1006 13:04:14.593663 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.593678 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.593697 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.593711 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:14Z","lastTransitionTime":"2025-10-06T13:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.649647 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" Oct 06 13:04:14 crc kubenswrapper[4867]: W1006 13:04:14.666596 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc693b796_691d_4cc2_8d01_a0589e8833ef.slice/crio-73adf6ae60ff117ea53ea2f657f6e870d1ef57e65f527da35879c2e44c9691b7 WatchSource:0}: Error finding container 73adf6ae60ff117ea53ea2f657f6e870d1ef57e65f527da35879c2e44c9691b7: Status 404 returned error can't find the container with id 73adf6ae60ff117ea53ea2f657f6e870d1ef57e65f527da35879c2e44c9691b7 Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.697225 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.697323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.697343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.697374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.697398 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:14Z","lastTransitionTime":"2025-10-06T13:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.800836 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.800923 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.800961 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.801003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.801029 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:14Z","lastTransitionTime":"2025-10-06T13:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.905124 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.905188 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.905209 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.905238 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:14 crc kubenswrapper[4867]: I1006 13:04:14.905298 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:14Z","lastTransitionTime":"2025-10-06T13:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.007722 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.007777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.007796 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.007823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.007843 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:15Z","lastTransitionTime":"2025-10-06T13:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.111603 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.111642 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.111653 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.111669 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.111679 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:15Z","lastTransitionTime":"2025-10-06T13:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.214240 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.214301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.214314 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.214331 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.214368 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:15Z","lastTransitionTime":"2025-10-06T13:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.220935 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.221115 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:15 crc kubenswrapper[4867]: E1006 13:04:15.221209 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.221358 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:15 crc kubenswrapper[4867]: E1006 13:04:15.221615 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:15 crc kubenswrapper[4867]: E1006 13:04:15.221497 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.317300 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.317374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.317394 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.317421 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.317440 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:15Z","lastTransitionTime":"2025-10-06T13:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.420069 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.420117 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.420126 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.420142 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.420155 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:15Z","lastTransitionTime":"2025-10-06T13:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.465829 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8t2sq"] Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.466897 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:15 crc kubenswrapper[4867]: E1006 13:04:15.467038 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.469658 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" event={"ID":"c693b796-691d-4cc2-8d01-a0589e8833ef","Type":"ContainerStarted","Data":"c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.469756 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" event={"ID":"c693b796-691d-4cc2-8d01-a0589e8833ef","Type":"ContainerStarted","Data":"1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.469789 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" event={"ID":"c693b796-691d-4cc2-8d01-a0589e8833ef","Type":"ContainerStarted","Data":"73adf6ae60ff117ea53ea2f657f6e870d1ef57e65f527da35879c2e44c9691b7"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.498582 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:11Z\\\",\\\"message\\\":\\\"s-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230527 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230531 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-rssjd in node crc\\\\nI1006 13:04:11.230535 6287 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd after 0 failed attempt(s)\\\\nI1006 13:04:11.230539 6287 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230546 6287 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230552 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230562 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-knnfm in node crc\\\\nI1006 13:04:11.230567 6287 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-knnfm after 0 failed attempt(s)\\\\nF1006 13:04:11.230569 6287 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.519960 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.523340 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.523390 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.523402 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.523421 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.523434 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:15Z","lastTransitionTime":"2025-10-06T13:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.536474 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.539398 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.539495 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbqrf\" (UniqueName: 
\"kubernetes.io/projected/b78c9415-85bd-40db-b44f-f1e04797a66e-kube-api-access-qbqrf\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.552012 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.568602 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.584857 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.600315 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.624015 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.625797 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.625843 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.625855 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.625871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.625881 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:15Z","lastTransitionTime":"2025-10-06T13:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.636780 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.640472 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.640534 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbqrf\" (UniqueName: \"kubernetes.io/projected/b78c9415-85bd-40db-b44f-f1e04797a66e-kube-api-access-qbqrf\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:15 crc kubenswrapper[4867]: E1006 13:04:15.641033 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:15 crc kubenswrapper[4867]: E1006 13:04:15.641140 4867 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs podName:b78c9415-85bd-40db-b44f-f1e04797a66e nodeName:}" failed. No retries permitted until 2025-10-06 13:04:16.141111731 +0000 UTC m=+35.599059875 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs") pod "network-metrics-daemon-8t2sq" (UID: "b78c9415-85bd-40db-b44f-f1e04797a66e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.663783 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbqrf\" (UniqueName: \"kubernetes.io/projected/b78c9415-85bd-40db-b44f-f1e04797a66e-kube-api-access-qbqrf\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.664128 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.686225 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.707975 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.723987 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.729127 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.729182 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.729200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.729224 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.729241 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:15Z","lastTransitionTime":"2025-10-06T13:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.743874 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 
dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.759562 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.800382 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.813522 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.831669 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:15 crc 
kubenswrapper[4867]: I1006 13:04:15.831720 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.831730 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.831747 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.831759 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:15Z","lastTransitionTime":"2025-10-06T13:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.832025 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.845797 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.858859 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.873737 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.889485 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.908282 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.919595 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.933942 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.933996 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.934006 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.934019 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.934027 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:15Z","lastTransitionTime":"2025-10-06T13:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.940207 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.953403 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.967158 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.981804 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:15 crc kubenswrapper[4867]: I1006 13:04:15.995513 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 
13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51
76098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:15Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.011693 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:16Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.028063 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:16Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.037730 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.037762 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.037772 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.037785 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.037795 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:16Z","lastTransitionTime":"2025-10-06T13:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.049830 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:16Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.081119 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:11Z\\\",\\\"message\\\":\\\"s-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230527 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230531 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-rssjd in node crc\\\\nI1006 13:04:11.230535 6287 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd after 0 failed attempt(s)\\\\nI1006 13:04:11.230539 6287 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230546 6287 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230552 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230562 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-knnfm in node crc\\\\nI1006 13:04:11.230567 6287 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-knnfm after 0 failed attempt(s)\\\\nF1006 13:04:11.230569 6287 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:16Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.095307 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:16Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.140887 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.140943 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.140955 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.140978 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.140993 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:16Z","lastTransitionTime":"2025-10-06T13:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.144538 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.144763 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.144863 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs podName:b78c9415-85bd-40db-b44f-f1e04797a66e nodeName:}" failed. No retries permitted until 2025-10-06 13:04:17.144833045 +0000 UTC m=+36.602781199 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs") pod "network-metrics-daemon-8t2sq" (UID: "b78c9415-85bd-40db-b44f-f1e04797a66e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.244957 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.245024 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.245037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.245058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.245074 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:16Z","lastTransitionTime":"2025-10-06T13:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.348069 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.348105 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.348115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.348129 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.348139 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:16Z","lastTransitionTime":"2025-10-06T13:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.450580 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.450624 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.450637 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.450654 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.450666 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:16Z","lastTransitionTime":"2025-10-06T13:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.553681 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.553732 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.553755 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.553772 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.553784 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:16Z","lastTransitionTime":"2025-10-06T13:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.656578 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.656623 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.656633 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.656646 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.656656 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:16Z","lastTransitionTime":"2025-10-06T13:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.705332 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.706986 4867 scope.go:117] "RemoveContainer" containerID="327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d" Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.707518 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.760325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.760398 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.760414 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.760436 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.760450 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:16Z","lastTransitionTime":"2025-10-06T13:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.862632 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.862678 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.862688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.862706 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.862719 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:16Z","lastTransitionTime":"2025-10-06T13:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.953246 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953416 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 13:04:32.953388135 +0000 UTC m=+52.411336279 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.953472 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.953534 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953584 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953647 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-06 13:04:32.953629731 +0000 UTC m=+52.411577885 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.953590 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953681 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953695 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.953698 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953707 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953749 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953777 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:32.953768274 +0000 UTC m=+52.411716428 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953788 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953839 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953854 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953795 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:32.953787035 +0000 UTC m=+52.411735189 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:16 crc kubenswrapper[4867]: E1006 13:04:16.953939 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 13:04:32.953918828 +0000 UTC m=+52.411866972 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.964765 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.964823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.964842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.964868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:16 crc kubenswrapper[4867]: I1006 13:04:16.964886 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:16Z","lastTransitionTime":"2025-10-06T13:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.066902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.067082 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.067095 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.067109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.067120 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.155523 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.155703 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.155799 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs podName:b78c9415-85bd-40db-b44f-f1e04797a66e nodeName:}" failed. No retries permitted until 2025-10-06 13:04:19.155773931 +0000 UTC m=+38.613722105 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs") pod "network-metrics-daemon-8t2sq" (UID: "b78c9415-85bd-40db-b44f-f1e04797a66e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.161327 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.161386 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.161407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.161434 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.161452 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.175994 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:17Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.180770 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.180803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.180814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.180831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.180844 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.194328 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:17Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.198346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.198403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.198421 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.198444 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.198461 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.212124 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:17Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.216750 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.216788 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.216800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.216816 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.216828 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.220543 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.220578 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.220582 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.220695 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.220735 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.220885 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.220982 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.221041 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.233685 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:17Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.237837 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.237889 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.237901 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.237923 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.237939 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.257045 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:17Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:17 crc kubenswrapper[4867]: E1006 13:04:17.257307 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.259306 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.259349 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.259361 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.259404 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc 
kubenswrapper[4867]: I1006 13:04:17.259417 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.362219 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.362310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.362322 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.362341 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.362353 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.465396 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.465468 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.465480 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.465497 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.465511 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.569517 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.569564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.569573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.569645 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.569657 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.673067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.673134 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.673147 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.673163 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.673173 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.776583 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.776692 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.776714 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.776746 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.776767 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.880497 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.880581 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.880620 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.880657 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.880683 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.947822 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.962536 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:17Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.984563 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.984626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.984637 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.984657 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.984676 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:17Z","lastTransitionTime":"2025-10-06T13:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:17 crc kubenswrapper[4867]: I1006 13:04:17.987102 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:17Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.006515 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.025664 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.043373 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.060354 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.073825 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.088506 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.088579 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.088590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.088609 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.088622 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:18Z","lastTransitionTime":"2025-10-06T13:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.095941 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.110353 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] 
\\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.124919 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.148639 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.166097 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc 
kubenswrapper[4867]: I1006 13:04:18.187295 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.192087 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.192131 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.192145 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.192184 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.192198 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:18Z","lastTransitionTime":"2025-10-06T13:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.204538 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.219330 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.249217 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:11Z\\\",\\\"message\\\":\\\"s-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230527 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230531 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-rssjd in node crc\\\\nI1006 13:04:11.230535 6287 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd after 0 failed attempt(s)\\\\nI1006 13:04:11.230539 6287 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230546 6287 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230552 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230562 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-knnfm in node crc\\\\nI1006 13:04:11.230567 6287 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-knnfm after 0 failed attempt(s)\\\\nF1006 13:04:11.230569 6287 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.267419 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.295282 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.295324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.295351 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.295368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.295379 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:18Z","lastTransitionTime":"2025-10-06T13:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.398175 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.398219 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.398229 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.398245 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.398269 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:18Z","lastTransitionTime":"2025-10-06T13:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.501053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.501085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.501093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.501109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.501118 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:18Z","lastTransitionTime":"2025-10-06T13:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.604692 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.604759 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.604779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.604803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.604822 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:18Z","lastTransitionTime":"2025-10-06T13:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.708498 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.708552 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.708563 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.708581 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.708593 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:18Z","lastTransitionTime":"2025-10-06T13:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.812231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.812324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.812344 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.812375 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.812399 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:18Z","lastTransitionTime":"2025-10-06T13:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.915647 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.915726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.915748 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.915778 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:18 crc kubenswrapper[4867]: I1006 13:04:18.915801 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:18Z","lastTransitionTime":"2025-10-06T13:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.019162 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.019237 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.019307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.019338 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.019359 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:19Z","lastTransitionTime":"2025-10-06T13:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.122905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.122983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.123009 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.123047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.123072 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:19Z","lastTransitionTime":"2025-10-06T13:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.177922 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:19 crc kubenswrapper[4867]: E1006 13:04:19.178162 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:19 crc kubenswrapper[4867]: E1006 13:04:19.178373 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs podName:b78c9415-85bd-40db-b44f-f1e04797a66e nodeName:}" failed. No retries permitted until 2025-10-06 13:04:23.178333358 +0000 UTC m=+42.636281542 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs") pod "network-metrics-daemon-8t2sq" (UID: "b78c9415-85bd-40db-b44f-f1e04797a66e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.221056 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.221739 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:19 crc kubenswrapper[4867]: E1006 13:04:19.222026 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.222597 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:19 crc kubenswrapper[4867]: E1006 13:04:19.223245 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:19 crc kubenswrapper[4867]: E1006 13:04:19.223555 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.224245 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:19 crc kubenswrapper[4867]: E1006 13:04:19.224527 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.228338 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.228386 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.228397 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.228416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.228429 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:19Z","lastTransitionTime":"2025-10-06T13:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.331536 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.331599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.331610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.331625 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.331663 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:19Z","lastTransitionTime":"2025-10-06T13:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.434224 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.434295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.434311 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.434332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.434349 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:19Z","lastTransitionTime":"2025-10-06T13:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.536842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.536915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.536927 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.536946 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.536958 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:19Z","lastTransitionTime":"2025-10-06T13:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.639145 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.639420 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.639505 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.639594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.639657 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:19Z","lastTransitionTime":"2025-10-06T13:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.741729 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.741799 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.741809 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.741828 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.741837 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:19Z","lastTransitionTime":"2025-10-06T13:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.843757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.843822 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.843840 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.843863 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.843881 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:19Z","lastTransitionTime":"2025-10-06T13:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.946939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.947022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.947039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.947057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:19 crc kubenswrapper[4867]: I1006 13:04:19.947070 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:19Z","lastTransitionTime":"2025-10-06T13:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.049893 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.050367 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.050447 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.050536 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.050615 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:20Z","lastTransitionTime":"2025-10-06T13:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.154000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.154059 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.154075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.154099 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.154119 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:20Z","lastTransitionTime":"2025-10-06T13:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.257225 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.257579 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.257675 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.257779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.257880 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:20Z","lastTransitionTime":"2025-10-06T13:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.362055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.362152 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.362176 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.362213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.362242 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:20Z","lastTransitionTime":"2025-10-06T13:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.464955 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.465022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.465040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.465068 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.465091 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:20Z","lastTransitionTime":"2025-10-06T13:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.568146 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.568302 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.568327 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.568360 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.568383 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:20Z","lastTransitionTime":"2025-10-06T13:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.671178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.671227 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.671236 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.671266 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.671277 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:20Z","lastTransitionTime":"2025-10-06T13:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.775138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.775213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.775228 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.775280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.775300 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:20Z","lastTransitionTime":"2025-10-06T13:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.878023 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.878081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.878094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.878114 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.878125 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:20Z","lastTransitionTime":"2025-10-06T13:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.981561 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.981626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.981640 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.981663 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:20 crc kubenswrapper[4867]: I1006 13:04:20.981676 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:20Z","lastTransitionTime":"2025-10-06T13:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.084969 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.085025 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.085036 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.085057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.085069 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:21Z","lastTransitionTime":"2025-10-06T13:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.188276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.188342 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.188361 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.188389 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.188410 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:21Z","lastTransitionTime":"2025-10-06T13:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.221273 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.221309 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.221385 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:21 crc kubenswrapper[4867]: E1006 13:04:21.221448 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.221289 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:21 crc kubenswrapper[4867]: E1006 13:04:21.221581 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:21 crc kubenswrapper[4867]: E1006 13:04:21.221683 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:21 crc kubenswrapper[4867]: E1006 13:04:21.221766 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.232646 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.257380 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.278494 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.291530 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.291596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.291613 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 
13:04:21.291640 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.291661 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:21Z","lastTransitionTime":"2025-10-06T13:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.295096 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.311381 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.331307 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.343637 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.362413 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.381168 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\"
,\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 
13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.394632 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.394680 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.394694 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.394724 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.394739 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:21Z","lastTransitionTime":"2025-10-06T13:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.399051 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.421215 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.438115 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.454727 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.476860 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.493494 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.496913 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.496966 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.496981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.497002 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.497017 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:21Z","lastTransitionTime":"2025-10-06T13:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.526003 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:11Z\\\",\\\"message\\\":\\\"s-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230527 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230531 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-rssjd in node crc\\\\nI1006 13:04:11.230535 6287 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd after 0 failed attempt(s)\\\\nI1006 13:04:11.230539 6287 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230546 6287 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230552 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230562 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-knnfm in node crc\\\\nI1006 13:04:11.230567 6287 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-knnfm after 0 failed attempt(s)\\\\nF1006 13:04:11.230569 6287 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.539553 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:21Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.599284 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.599335 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.599345 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.599363 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.599373 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:21Z","lastTransitionTime":"2025-10-06T13:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.701679 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.701762 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.701782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.701814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.701836 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:21Z","lastTransitionTime":"2025-10-06T13:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.804994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.805057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.805069 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.805092 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.805108 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:21Z","lastTransitionTime":"2025-10-06T13:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.908523 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.908577 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.908593 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.908614 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:21 crc kubenswrapper[4867]: I1006 13:04:21.908629 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:21Z","lastTransitionTime":"2025-10-06T13:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.011601 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.011650 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.011665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.011686 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.011699 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:22Z","lastTransitionTime":"2025-10-06T13:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.115678 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.115782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.115809 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.115845 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.115868 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:22Z","lastTransitionTime":"2025-10-06T13:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.219760 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.219836 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.219848 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.219867 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.219880 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:22Z","lastTransitionTime":"2025-10-06T13:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.321938 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.322013 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.322022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.322041 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.322052 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:22Z","lastTransitionTime":"2025-10-06T13:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.425165 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.425226 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.425242 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.425287 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.425304 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:22Z","lastTransitionTime":"2025-10-06T13:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.528299 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.528362 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.528381 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.528402 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.528417 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:22Z","lastTransitionTime":"2025-10-06T13:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.630742 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.630861 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.630877 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.630900 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.630910 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:22Z","lastTransitionTime":"2025-10-06T13:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.734476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.734530 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.734542 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.734563 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.734576 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:22Z","lastTransitionTime":"2025-10-06T13:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.837216 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.837285 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.837300 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.837319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.837332 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:22Z","lastTransitionTime":"2025-10-06T13:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.940039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.940086 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.940096 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.940112 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:22 crc kubenswrapper[4867]: I1006 13:04:22.940122 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:22Z","lastTransitionTime":"2025-10-06T13:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.042632 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.042694 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.042709 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.042723 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.042743 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:23Z","lastTransitionTime":"2025-10-06T13:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.144935 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.144986 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.144999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.145016 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.145028 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:23Z","lastTransitionTime":"2025-10-06T13:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.220242 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.220303 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.220314 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.220312 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:23 crc kubenswrapper[4867]: E1006 13:04:23.220411 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:23 crc kubenswrapper[4867]: E1006 13:04:23.220521 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:23 crc kubenswrapper[4867]: E1006 13:04:23.220619 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:23 crc kubenswrapper[4867]: E1006 13:04:23.220728 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.226459 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:23 crc kubenswrapper[4867]: E1006 13:04:23.226646 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:23 crc kubenswrapper[4867]: E1006 13:04:23.226695 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs podName:b78c9415-85bd-40db-b44f-f1e04797a66e nodeName:}" failed. No retries permitted until 2025-10-06 13:04:31.226679939 +0000 UTC m=+50.684628083 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs") pod "network-metrics-daemon-8t2sq" (UID: "b78c9415-85bd-40db-b44f-f1e04797a66e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.247571 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.247604 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.247619 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.247635 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.247653 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:23Z","lastTransitionTime":"2025-10-06T13:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.350041 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.350093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.350104 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.350120 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.350176 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:23Z","lastTransitionTime":"2025-10-06T13:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.453033 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.453076 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.453090 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.453111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.453125 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:23Z","lastTransitionTime":"2025-10-06T13:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.556101 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.556157 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.556171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.556190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.556205 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:23Z","lastTransitionTime":"2025-10-06T13:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.658916 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.658961 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.658973 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.658990 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.659002 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:23Z","lastTransitionTime":"2025-10-06T13:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.761566 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.761620 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.761637 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.761657 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.761673 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:23Z","lastTransitionTime":"2025-10-06T13:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.863752 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.863790 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.863799 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.863814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.863825 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:23Z","lastTransitionTime":"2025-10-06T13:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.966507 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.966557 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.966567 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.966596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:23 crc kubenswrapper[4867]: I1006 13:04:23.966609 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:23Z","lastTransitionTime":"2025-10-06T13:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.068811 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.068868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.068880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.068899 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.068911 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:24Z","lastTransitionTime":"2025-10-06T13:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.171452 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.171495 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.171506 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.171520 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.171530 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:24Z","lastTransitionTime":"2025-10-06T13:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.274026 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.274067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.274079 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.274093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.274104 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:24Z","lastTransitionTime":"2025-10-06T13:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.376076 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.376110 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.376120 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.376132 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.376140 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:24Z","lastTransitionTime":"2025-10-06T13:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.478431 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.478461 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.478469 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.478481 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.478492 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:24Z","lastTransitionTime":"2025-10-06T13:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.580834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.580905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.580929 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.581022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.581133 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:24Z","lastTransitionTime":"2025-10-06T13:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.683687 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.683745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.683756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.683777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.683788 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:24Z","lastTransitionTime":"2025-10-06T13:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.787438 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.787518 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.787534 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.787562 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.787580 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:24Z","lastTransitionTime":"2025-10-06T13:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.891163 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.891230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.891286 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.891317 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.891340 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:24Z","lastTransitionTime":"2025-10-06T13:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.994964 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.995025 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.995047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.995072 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:24 crc kubenswrapper[4867]: I1006 13:04:24.995090 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:24Z","lastTransitionTime":"2025-10-06T13:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.097931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.098002 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.098018 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.098039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.098055 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:25Z","lastTransitionTime":"2025-10-06T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.201284 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.201356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.201376 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.201402 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.201421 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:25Z","lastTransitionTime":"2025-10-06T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.220699 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.220855 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.220770 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.220965 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:25 crc kubenswrapper[4867]: E1006 13:04:25.221012 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:25 crc kubenswrapper[4867]: E1006 13:04:25.221204 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:25 crc kubenswrapper[4867]: E1006 13:04:25.221395 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:25 crc kubenswrapper[4867]: E1006 13:04:25.221530 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.305277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.305332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.305347 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.305368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.305383 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:25Z","lastTransitionTime":"2025-10-06T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.408929 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.408968 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.408976 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.408994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.409006 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:25Z","lastTransitionTime":"2025-10-06T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.512357 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.512434 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.512481 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.512513 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.512532 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:25Z","lastTransitionTime":"2025-10-06T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.614598 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.614638 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.614649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.614667 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.614680 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:25Z","lastTransitionTime":"2025-10-06T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.717351 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.717397 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.717409 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.717426 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.717440 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:25Z","lastTransitionTime":"2025-10-06T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.820190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.820233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.820281 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.820301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.820313 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:25Z","lastTransitionTime":"2025-10-06T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.923056 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.923100 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.923109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.923125 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:25 crc kubenswrapper[4867]: I1006 13:04:25.923135 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:25Z","lastTransitionTime":"2025-10-06T13:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.025200 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.025248 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.025291 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.025310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.025322 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:26Z","lastTransitionTime":"2025-10-06T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.128196 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.128241 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.128270 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.128286 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.128300 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:26Z","lastTransitionTime":"2025-10-06T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.230851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.230898 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.230909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.230926 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.230938 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:26Z","lastTransitionTime":"2025-10-06T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.334092 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.334158 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.334170 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.334195 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.334210 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:26Z","lastTransitionTime":"2025-10-06T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.438018 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.438115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.438130 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.438156 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.438171 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:26Z","lastTransitionTime":"2025-10-06T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.542472 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.542570 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.542590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.542620 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.542645 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:26Z","lastTransitionTime":"2025-10-06T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.646135 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.646230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.646285 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.646322 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.646345 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:26Z","lastTransitionTime":"2025-10-06T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.749790 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.749879 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.749900 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.749942 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.749963 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:26Z","lastTransitionTime":"2025-10-06T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.853190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.853332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.853351 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.853380 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.853398 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:26Z","lastTransitionTime":"2025-10-06T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.956036 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.956100 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.956112 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.956131 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:26 crc kubenswrapper[4867]: I1006 13:04:26.956143 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:26Z","lastTransitionTime":"2025-10-06T13:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.059827 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.059879 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.059889 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.059907 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.059917 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.163107 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.163155 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.163164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.163180 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.163191 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.221189 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.221343 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.221399 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.221407 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:27 crc kubenswrapper[4867]: E1006 13:04:27.221600 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:27 crc kubenswrapper[4867]: E1006 13:04:27.221733 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:27 crc kubenswrapper[4867]: E1006 13:04:27.221845 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:27 crc kubenswrapper[4867]: E1006 13:04:27.222095 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.266229 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.266313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.266328 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.266351 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.266364 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.369643 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.369726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.369749 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.369781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.369802 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.472782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.472834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.472843 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.472858 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.472903 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.491047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.491119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.491138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.491166 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.491184 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: E1006 13:04:27.508149 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:27Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.517947 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.517998 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.518013 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.518031 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.518043 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: E1006 13:04:27.533296 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:27Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.539011 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.539077 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.539105 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.539140 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.539162 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: E1006 13:04:27.559755 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:27Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.565374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.565441 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.565458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.565480 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.565500 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: E1006 13:04:27.587031 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:27Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.592925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.593010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.593031 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.593062 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.593085 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: E1006 13:04:27.614881 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:27Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:27 crc kubenswrapper[4867]: E1006 13:04:27.615111 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.617072 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.617119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.617129 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.617148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.617167 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.721053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.721148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.721179 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.721215 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.721309 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.824568 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.824674 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.824711 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.824749 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.824775 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.928037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.928118 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.928146 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.928181 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:27 crc kubenswrapper[4867]: I1006 13:04:27.928204 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:27Z","lastTransitionTime":"2025-10-06T13:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.032309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.032412 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.032431 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.032461 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.032482 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:28Z","lastTransitionTime":"2025-10-06T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.135484 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.135576 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.135595 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.135625 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.135646 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:28Z","lastTransitionTime":"2025-10-06T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.239459 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.239557 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.239587 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.239622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.239648 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:28Z","lastTransitionTime":"2025-10-06T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.342099 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.342147 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.342158 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.342177 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.342187 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:28Z","lastTransitionTime":"2025-10-06T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.444202 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.444245 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.444271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.444288 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.444299 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:28Z","lastTransitionTime":"2025-10-06T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.547063 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.547137 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.547162 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.547192 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.547212 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:28Z","lastTransitionTime":"2025-10-06T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.650616 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.650664 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.650677 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.650695 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.650708 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:28Z","lastTransitionTime":"2025-10-06T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.752958 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.752995 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.753008 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.753024 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.753038 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:28Z","lastTransitionTime":"2025-10-06T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.856125 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.856221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.856246 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.856319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.856346 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:28Z","lastTransitionTime":"2025-10-06T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.959373 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.959449 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.959469 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.959498 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:28 crc kubenswrapper[4867]: I1006 13:04:28.959519 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:28Z","lastTransitionTime":"2025-10-06T13:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.062766 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.062847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.062862 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.062885 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.062901 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:29Z","lastTransitionTime":"2025-10-06T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.166959 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.167019 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.167029 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.167046 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.167061 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:29Z","lastTransitionTime":"2025-10-06T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.220968 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.221160 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.221155 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:29 crc kubenswrapper[4867]: E1006 13:04:29.221403 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.221432 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:29 crc kubenswrapper[4867]: E1006 13:04:29.221584 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:29 crc kubenswrapper[4867]: E1006 13:04:29.221633 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:29 crc kubenswrapper[4867]: E1006 13:04:29.221751 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.271407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.271470 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.271489 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.271522 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.271546 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:29Z","lastTransitionTime":"2025-10-06T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.374564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.374631 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.374688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.374717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.374736 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:29Z","lastTransitionTime":"2025-10-06T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.478408 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.478467 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.478482 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.478505 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.478520 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:29Z","lastTransitionTime":"2025-10-06T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.582360 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.582443 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.582466 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.582496 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.582518 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:29Z","lastTransitionTime":"2025-10-06T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.686032 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.686089 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.686099 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.686115 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.686126 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:29Z","lastTransitionTime":"2025-10-06T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.788562 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.788637 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.788657 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.788684 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.788704 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:29Z","lastTransitionTime":"2025-10-06T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.891866 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.891944 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.891963 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.891992 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.892012 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:29Z","lastTransitionTime":"2025-10-06T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.994907 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.994972 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.994983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.995035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:29 crc kubenswrapper[4867]: I1006 13:04:29.995048 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:29Z","lastTransitionTime":"2025-10-06T13:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.098055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.098141 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.098158 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.098189 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.098207 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:30Z","lastTransitionTime":"2025-10-06T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.200458 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.200541 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.200563 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.200593 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.200615 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:30Z","lastTransitionTime":"2025-10-06T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.302992 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.303037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.303049 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.303067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.303080 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:30Z","lastTransitionTime":"2025-10-06T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.406231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.406315 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.406334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.406359 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.406374 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:30Z","lastTransitionTime":"2025-10-06T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.508845 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.508909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.508926 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.508955 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.508976 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:30Z","lastTransitionTime":"2025-10-06T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.611885 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.611932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.611940 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.611958 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.611971 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:30Z","lastTransitionTime":"2025-10-06T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.714537 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.714574 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.714585 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.714599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.714609 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:30Z","lastTransitionTime":"2025-10-06T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.817410 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.817462 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.817472 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.817485 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.817495 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:30Z","lastTransitionTime":"2025-10-06T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.920238 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.920313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.920325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.920343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:30 crc kubenswrapper[4867]: I1006 13:04:30.920354 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:30Z","lastTransitionTime":"2025-10-06T13:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.022622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.022663 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.022671 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.022685 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.022693 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:31Z","lastTransitionTime":"2025-10-06T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.125037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.125098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.125112 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.125135 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.125150 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:31Z","lastTransitionTime":"2025-10-06T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.220991 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:31 crc kubenswrapper[4867]: E1006 13:04:31.221168 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.221632 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.221713 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.221624 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:31 crc kubenswrapper[4867]: E1006 13:04:31.221968 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:31 crc kubenswrapper[4867]: E1006 13:04:31.222436 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:31 crc kubenswrapper[4867]: E1006 13:04:31.222505 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.224392 4867 scope.go:117] "RemoveContainer" containerID="327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.227006 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.227045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.227057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.227075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.227088 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:31Z","lastTransitionTime":"2025-10-06T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.238199 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.240873 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:31 crc kubenswrapper[4867]: E1006 13:04:31.241015 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:31 crc kubenswrapper[4867]: E1006 13:04:31.241069 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs podName:b78c9415-85bd-40db-b44f-f1e04797a66e nodeName:}" failed. No retries permitted until 2025-10-06 13:04:47.241052167 +0000 UTC m=+66.699000301 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs") pod "network-metrics-daemon-8t2sq" (UID: "b78c9415-85bd-40db-b44f-f1e04797a66e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.253790 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.286456 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc 
kubenswrapper[4867]: I1006 13:04:31.310008 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.329885 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.329938 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.329952 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.329974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 
13:04:31.329990 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:31Z","lastTransitionTime":"2025-10-06T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.350095 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:11Z\\\",\\\"message\\\":\\\"s-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230527 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230531 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-rssjd in node crc\\\\nI1006 13:04:11.230535 6287 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd after 0 failed attempt(s)\\\\nI1006 13:04:11.230539 6287 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230546 6287 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230552 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230562 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-knnfm in node crc\\\\nI1006 13:04:11.230567 6287 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-knnfm after 0 failed attempt(s)\\\\nF1006 13:04:11.230569 6287 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.368317 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.383583 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.395502 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.414043 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.428270 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.432671 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.432706 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.432717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.432734 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.432746 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:31Z","lastTransitionTime":"2025-10-06T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.452554 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.471537 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.487197 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.503529 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.519152 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.538223 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.538269 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.538278 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.538293 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.538303 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:31Z","lastTransitionTime":"2025-10-06T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.539511 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.539741 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/1.log" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.542887 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58"} Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.543346 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.559069 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:
//f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operat
or@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.573913 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.601187 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:11Z\\\",\\\"message\\\":\\\"s-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230527 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230531 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-rssjd in node crc\\\\nI1006 13:04:11.230535 6287 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd after 0 failed attempt(s)\\\\nI1006 13:04:11.230539 6287 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230546 6287 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230552 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230562 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-knnfm in node crc\\\\nI1006 13:04:11.230567 6287 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-knnfm after 0 failed attempt(s)\\\\nF1006 13:04:11.230569 6287 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.616022 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc 
kubenswrapper[4867]: I1006 13:04:31.632818 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.641286 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.641346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.641357 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.641379 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.641393 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:31Z","lastTransitionTime":"2025-10-06T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.649210 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.666891 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.682138 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.697680 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.718448 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.731728 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.743740 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.743785 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.743794 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.743808 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.743820 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:31Z","lastTransitionTime":"2025-10-06T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.756949 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.775564 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.794483 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.810279 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.826434 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d08940
5025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.841980 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.845682 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:31 crc 
kubenswrapper[4867]: I1006 13:04:31.845713 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.845721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.845735 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.845746 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:31Z","lastTransitionTime":"2025-10-06T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.858617 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:31Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.948543 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.948594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.948604 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:31 crc 
kubenswrapper[4867]: I1006 13:04:31.948622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:31 crc kubenswrapper[4867]: I1006 13:04:31.948635 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:31Z","lastTransitionTime":"2025-10-06T13:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.051747 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.051809 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.051821 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.051843 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.051858 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:32Z","lastTransitionTime":"2025-10-06T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.154408 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.154469 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.154484 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.154504 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.154518 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:32Z","lastTransitionTime":"2025-10-06T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.257267 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.257313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.257323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.257343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.257353 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:32Z","lastTransitionTime":"2025-10-06T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.360964 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.361019 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.361028 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.361053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.361064 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:32Z","lastTransitionTime":"2025-10-06T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.463693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.463744 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.463756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.463781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.463795 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:32Z","lastTransitionTime":"2025-10-06T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.548037 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/2.log" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.548846 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/1.log" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.551479 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58" exitCode=1 Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.551517 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58"} Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.551546 4867 scope.go:117] "RemoveContainer" containerID="327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.552207 4867 scope.go:117] "RemoveContainer" containerID="b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58" Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.552418 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.567234 4867 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.567450 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.567483 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.567570 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.567611 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:32Z","lastTransitionTime":"2025-10-06T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.573223 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.589235 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.623394 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327fdd8c18fa62ea402eefc815505a8a8262a12fd7372cd11a2fead8e0b8255d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:11Z\\\",\\\"message\\\":\\\"s-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230527 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230531 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-rssjd in node crc\\\\nI1006 13:04:11.230535 6287 obj_retry.go:386] 
Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-rssjd after 0 failed attempt(s)\\\\nI1006 13:04:11.230539 6287 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-rssjd\\\\nI1006 13:04:11.230546 6287 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230552 6287 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-knnfm\\\\nI1006 13:04:11.230562 6287 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-knnfm in node crc\\\\nI1006 13:04:11.230567 6287 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-knnfm after 0 failed attempt(s)\\\\nF1006 13:04:11.230569 6287 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:32Z\\\",\\\"message\\\":\\\"mon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1006 13:04:32.221806 6511 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:04:32.221820 6511 
tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.637605 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.653757 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.670136 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.670812 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.670859 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.670873 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.670891 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.670904 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:32Z","lastTransitionTime":"2025-10-06T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.687480 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.704933 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.723869 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.740556 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.754850 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.774309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.774387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.774406 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.774435 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.774453 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:32Z","lastTransitionTime":"2025-10-06T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.778958 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.795356 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.813247 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.825985 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.843034 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d08940
5025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.860455 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:32Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.877717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.877763 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.877773 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.877788 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.877798 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:32Z","lastTransitionTime":"2025-10-06T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.960630 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.960789 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.960830 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.960857 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:05:04.9608298 +0000 UTC m=+84.418777944 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.960909 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.960949 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.960978 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.961014 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:05:04.960996574 +0000 UTC m=+84.418944708 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.961153 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.961167 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.961178 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.961201 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.961297 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.961312 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.961485 4867 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.961222 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 13:05:04.961214399 +0000 UTC m=+84.419162543 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.961545 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:05:04.961533537 +0000 UTC m=+84.419481801 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:04:32 crc kubenswrapper[4867]: E1006 13:04:32.961560 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 13:05:04.961553987 +0000 UTC m=+84.419502251 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.981492 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.981534 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.981544 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.981560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:32 crc kubenswrapper[4867]: I1006 13:04:32.981574 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:32Z","lastTransitionTime":"2025-10-06T13:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.084965 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.085060 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.085087 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.085123 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.085149 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:33Z","lastTransitionTime":"2025-10-06T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.189133 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.189190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.189203 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.189224 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.189268 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:33Z","lastTransitionTime":"2025-10-06T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.221081 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.221145 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.221160 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:33 crc kubenswrapper[4867]: E1006 13:04:33.221365 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.221466 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:33 crc kubenswrapper[4867]: E1006 13:04:33.221643 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:33 crc kubenswrapper[4867]: E1006 13:04:33.221805 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:33 crc kubenswrapper[4867]: E1006 13:04:33.221908 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.292953 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.293028 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.293045 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.293071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.293092 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:33Z","lastTransitionTime":"2025-10-06T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.397431 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.397510 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.397530 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.397560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.397586 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:33Z","lastTransitionTime":"2025-10-06T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.502149 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.502227 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.502246 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.502305 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.502326 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:33Z","lastTransitionTime":"2025-10-06T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.560341 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/2.log" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.565732 4867 scope.go:117] "RemoveContainer" containerID="b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58" Oct 06 13:04:33 crc kubenswrapper[4867]: E1006 13:04:33.565927 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.584443 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.605829 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.605902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.605914 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:33 crc 
kubenswrapper[4867]: I1006 13:04:33.605936 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.605948 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:33Z","lastTransitionTime":"2025-10-06T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.610591 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.628980 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.658509 
4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:32Z\\\",\\\"message\\\":\\\"mon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 13:04:32.221806 6511 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:04:32.221820 6511 tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.675681 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.691818 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc 
kubenswrapper[4867]: I1006 13:04:33.705934 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.708665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.708759 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.708777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.708806 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.708826 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:33Z","lastTransitionTime":"2025-10-06T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.729553 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.744128 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.773169 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.798415 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.811808 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.811880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.811894 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 
13:04:33.811920 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.811937 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:33Z","lastTransitionTime":"2025-10-06T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.814569 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.838034 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.855563 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb
05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.881758 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\"
,\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 
13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.902081 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.915450 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.915501 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.915511 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.915528 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.915542 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:33Z","lastTransitionTime":"2025-10-06T13:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:33 crc kubenswrapper[4867]: I1006 13:04:33.918850 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:33Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.018339 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.018376 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.018386 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.018402 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.018412 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:34Z","lastTransitionTime":"2025-10-06T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.120960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.121010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.121020 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.121035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.121047 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:34Z","lastTransitionTime":"2025-10-06T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.224703 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.224802 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.224834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.224870 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.224901 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:34Z","lastTransitionTime":"2025-10-06T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.328560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.328685 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.328706 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.328735 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.328754 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:34Z","lastTransitionTime":"2025-10-06T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.432435 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.432511 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.432534 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.432565 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.432588 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:34Z","lastTransitionTime":"2025-10-06T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.535006 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.535069 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.535080 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.535095 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.535106 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:34Z","lastTransitionTime":"2025-10-06T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.638654 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.638731 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.638752 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.638787 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.638813 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:34Z","lastTransitionTime":"2025-10-06T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.742516 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.742633 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.742656 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.742688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.742709 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:34Z","lastTransitionTime":"2025-10-06T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.846489 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.846553 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.846571 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.846599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.846619 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:34Z","lastTransitionTime":"2025-10-06T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.950242 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.950343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.950358 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.950385 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:34 crc kubenswrapper[4867]: I1006 13:04:34.950403 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:34Z","lastTransitionTime":"2025-10-06T13:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.053844 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.053899 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.053914 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.053938 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.053954 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:35Z","lastTransitionTime":"2025-10-06T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.157079 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.157243 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.157299 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.157334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.157364 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:35Z","lastTransitionTime":"2025-10-06T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.221513 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.221619 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.221581 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.221549 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:35 crc kubenswrapper[4867]: E1006 13:04:35.221785 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:35 crc kubenswrapper[4867]: E1006 13:04:35.221978 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:35 crc kubenswrapper[4867]: E1006 13:04:35.222292 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:35 crc kubenswrapper[4867]: E1006 13:04:35.222376 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.261185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.261310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.261359 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.261388 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.261407 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:35Z","lastTransitionTime":"2025-10-06T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.364278 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.364326 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.364339 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.364371 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.364383 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:35Z","lastTransitionTime":"2025-10-06T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.468808 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.468853 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.468863 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.468879 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.468889 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:35Z","lastTransitionTime":"2025-10-06T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.573451 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.573847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.573859 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.573905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.573919 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:35Z","lastTransitionTime":"2025-10-06T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.676423 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.676701 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.676771 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.676849 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.676959 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:35Z","lastTransitionTime":"2025-10-06T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.779970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.780288 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.780386 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.780465 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.780528 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:35Z","lastTransitionTime":"2025-10-06T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.883319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.883387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.883403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.883430 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.883446 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:35Z","lastTransitionTime":"2025-10-06T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.986705 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.986748 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.986758 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.986776 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:35 crc kubenswrapper[4867]: I1006 13:04:35.986786 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:35Z","lastTransitionTime":"2025-10-06T13:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.090405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.090453 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.090465 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.090480 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.090492 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:36Z","lastTransitionTime":"2025-10-06T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.194151 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.194185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.194194 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.194207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.194216 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:36Z","lastTransitionTime":"2025-10-06T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.276540 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.287619 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.289461 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.296988 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.297029 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.297042 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.297064 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.297077 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:36Z","lastTransitionTime":"2025-10-06T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.301000 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.314466 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.332446 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:32Z\\\",\\\"message\\\":\\\"mon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 13:04:32.221806 6511 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:04:32.221820 6511 tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.345355 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.358429 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.376172 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753
a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.394500 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.400072 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.400164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.400186 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.400219 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.400243 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:36Z","lastTransitionTime":"2025-10-06T13:04:36Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.416118 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0a
fa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.429655 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.445098 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.464328 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.477393 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.490522 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\"
,\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 
13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.503238 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.503334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.503350 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.503369 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.503382 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:36Z","lastTransitionTime":"2025-10-06T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.509932 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.529954 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.544882 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:36Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.606884 4867 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.606932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.606943 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.606960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.606974 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:36Z","lastTransitionTime":"2025-10-06T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.709595 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.709649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.709661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.709683 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.709695 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:36Z","lastTransitionTime":"2025-10-06T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.813052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.813119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.813139 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.813171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.813191 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:36Z","lastTransitionTime":"2025-10-06T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.916514 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.916609 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.916638 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.916677 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:36 crc kubenswrapper[4867]: I1006 13:04:36.916700 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:36Z","lastTransitionTime":"2025-10-06T13:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.020501 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.020579 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.020599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.020630 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.020657 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.124846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.124932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.124955 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.124986 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.125013 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.220413 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.220463 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.220413 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:37 crc kubenswrapper[4867]: E1006 13:04:37.220661 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.220604 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:37 crc kubenswrapper[4867]: E1006 13:04:37.220587 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:37 crc kubenswrapper[4867]: E1006 13:04:37.220965 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:37 crc kubenswrapper[4867]: E1006 13:04:37.221008 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.227108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.227133 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.227143 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.227159 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.227170 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.329475 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.329518 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.329529 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.329546 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.329556 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.432005 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.432039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.432049 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.432062 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.432071 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.535247 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.535328 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.535345 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.535371 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.535392 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.638233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.638316 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.638326 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.638343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.638354 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.675311 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.675377 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.675392 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.675416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.675429 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: E1006 13:04:37.688807 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:37Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.694668 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.694715 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.694728 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.694745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.694759 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: E1006 13:04:37.706887 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:37Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.714755 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.714802 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.714814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.714838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.714853 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: E1006 13:04:37.727637 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:37Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.732152 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.732192 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.732204 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.732222 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.732233 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: E1006 13:04:37.745760 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:37Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.750966 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.751009 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.751020 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.751041 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.751054 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: E1006 13:04:37.764491 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:37Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:37 crc kubenswrapper[4867]: E1006 13:04:37.764611 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.766485 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.766513 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.766521 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.766535 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.766543 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.868774 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.868848 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.868866 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.868894 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.868913 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.971897 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.971960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.971983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.972005 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:37 crc kubenswrapper[4867]: I1006 13:04:37.972018 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:37Z","lastTransitionTime":"2025-10-06T13:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.074325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.074359 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.074368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.074383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.074395 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:38Z","lastTransitionTime":"2025-10-06T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.177493 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.177533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.177544 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.177560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.177572 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:38Z","lastTransitionTime":"2025-10-06T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.280417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.280467 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.280478 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.280494 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.280507 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:38Z","lastTransitionTime":"2025-10-06T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.384393 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.384445 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.384453 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.384471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.384482 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:38Z","lastTransitionTime":"2025-10-06T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.487725 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.487768 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.487778 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.487791 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.487799 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:38Z","lastTransitionTime":"2025-10-06T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.590536 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.590575 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.590594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.590613 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.590625 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:38Z","lastTransitionTime":"2025-10-06T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.693497 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.693544 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.693556 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.693571 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.693581 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:38Z","lastTransitionTime":"2025-10-06T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.796107 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.796145 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.796154 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.796167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.796180 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:38Z","lastTransitionTime":"2025-10-06T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.899043 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.899100 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.899111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.899130 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:38 crc kubenswrapper[4867]: I1006 13:04:38.899144 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:38Z","lastTransitionTime":"2025-10-06T13:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.001555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.001613 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.001621 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.001635 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.001651 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:39Z","lastTransitionTime":"2025-10-06T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.104318 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.104370 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.104379 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.104394 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.104403 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:39Z","lastTransitionTime":"2025-10-06T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.206335 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.206381 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.206393 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.206410 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.206429 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:39Z","lastTransitionTime":"2025-10-06T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.220729 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.220760 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.220759 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.220828 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:39 crc kubenswrapper[4867]: E1006 13:04:39.220827 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:39 crc kubenswrapper[4867]: E1006 13:04:39.220886 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:39 crc kubenswrapper[4867]: E1006 13:04:39.221003 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:39 crc kubenswrapper[4867]: E1006 13:04:39.221039 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.309014 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.309050 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.309058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.309075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.309083 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:39Z","lastTransitionTime":"2025-10-06T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.414575 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.414661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.414692 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.414727 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.414761 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:39Z","lastTransitionTime":"2025-10-06T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.517039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.517079 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.517090 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.517102 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.517110 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:39Z","lastTransitionTime":"2025-10-06T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.619347 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.619389 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.619400 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.619416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.619429 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:39Z","lastTransitionTime":"2025-10-06T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.721075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.721318 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.721333 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.721353 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.721363 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:39Z","lastTransitionTime":"2025-10-06T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.823326 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.823397 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.823417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.823435 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.823446 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:39Z","lastTransitionTime":"2025-10-06T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.925993 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.926027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.926035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.926050 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:39 crc kubenswrapper[4867]: I1006 13:04:39.926059 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:39Z","lastTransitionTime":"2025-10-06T13:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.028524 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.028553 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.028561 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.028573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.028581 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:40Z","lastTransitionTime":"2025-10-06T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.130292 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.130334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.130346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.130363 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.130376 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:40Z","lastTransitionTime":"2025-10-06T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.233185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.233223 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.233233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.233245 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.233274 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:40Z","lastTransitionTime":"2025-10-06T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.335355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.335411 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.335424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.335444 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.335457 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:40Z","lastTransitionTime":"2025-10-06T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.437705 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.437745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.437755 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.437769 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.437779 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:40Z","lastTransitionTime":"2025-10-06T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.539809 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.539852 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.539863 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.539880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.539890 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:40Z","lastTransitionTime":"2025-10-06T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.642152 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.642190 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.642198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.642215 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.642225 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:40Z","lastTransitionTime":"2025-10-06T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.743944 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.743977 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.743985 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.743997 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.744006 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:40Z","lastTransitionTime":"2025-10-06T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.846752 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.846867 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.846878 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.846894 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.846904 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:40Z","lastTransitionTime":"2025-10-06T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.948986 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.949018 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.949032 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.949047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:40 crc kubenswrapper[4867]: I1006 13:04:40.949058 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:40Z","lastTransitionTime":"2025-10-06T13:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.051510 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.051541 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.051550 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.051563 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.051574 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:41Z","lastTransitionTime":"2025-10-06T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.154297 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.154343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.154352 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.154367 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.154377 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:41Z","lastTransitionTime":"2025-10-06T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.220360 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.220383 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.220380 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:41 crc kubenswrapper[4867]: E1006 13:04:41.220486 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.220521 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:41 crc kubenswrapper[4867]: E1006 13:04:41.220769 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:41 crc kubenswrapper[4867]: E1006 13:04:41.220868 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:41 crc kubenswrapper[4867]: E1006 13:04:41.220682 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.234620 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 
UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.247834 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.256990 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.257042 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.257052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.257071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.257082 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:41Z","lastTransitionTime":"2025-10-06T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.261398 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea14188e-cc78-44b7-836e-1ecded809d4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57599ded487b78c1787aa529a0d4d2d6b684c445eb626d9efe0ddcdab321b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df801e51e6fb3509c31d3ed26436f
ef886ed99e5b5ec1589d83646466d76e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://166367864dd2a7f9fc01cb33d0bc921e345533200879fd32f85edcf85d1763c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.280601 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.294155 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.305990 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.320636 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.334843 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.353756 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:32Z\\\",\\\"message\\\":\\\"mon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 13:04:32.221806 6511 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:04:32.221820 6511 tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.358819 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.358859 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.358872 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.358887 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.358898 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:41Z","lastTransitionTime":"2025-10-06T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.366722 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc 
kubenswrapper[4867]: I1006 13:04:41.378334 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.391754 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c7364
7ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.402746 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.420487 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.432783 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.443277 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.456035 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.461533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.461568 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.461580 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.461596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.461607 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:41Z","lastTransitionTime":"2025-10-06T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.468344 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-06T13:04:41Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.563418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.563462 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.563473 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.563489 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.563499 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:41Z","lastTransitionTime":"2025-10-06T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.665492 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.665538 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.665547 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.665560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.665569 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:41Z","lastTransitionTime":"2025-10-06T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.767703 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.767742 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.767751 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.767765 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.767775 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:41Z","lastTransitionTime":"2025-10-06T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.869602 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.869629 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.869637 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.869649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.869658 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:41Z","lastTransitionTime":"2025-10-06T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.971399 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.971424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.971433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.971450 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:41 crc kubenswrapper[4867]: I1006 13:04:41.971459 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:41Z","lastTransitionTime":"2025-10-06T13:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.073750 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.073779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.073787 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.073800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.073808 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:42Z","lastTransitionTime":"2025-10-06T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.175273 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.175311 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.175319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.175331 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.175339 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:42Z","lastTransitionTime":"2025-10-06T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.277698 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.277737 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.277747 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.277760 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.277773 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:42Z","lastTransitionTime":"2025-10-06T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.379804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.379831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.379838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.379851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.379859 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:42Z","lastTransitionTime":"2025-10-06T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.482489 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.482533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.482544 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.482559 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.482571 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:42Z","lastTransitionTime":"2025-10-06T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.584673 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.584720 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.584734 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.584751 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.584762 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:42Z","lastTransitionTime":"2025-10-06T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.686658 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.686717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.686737 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.686763 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.686785 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:42Z","lastTransitionTime":"2025-10-06T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.789096 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.789150 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.789164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.789184 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.789197 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:42Z","lastTransitionTime":"2025-10-06T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.891925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.892243 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.892279 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.892295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.892305 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:42Z","lastTransitionTime":"2025-10-06T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.995404 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.995442 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.995452 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.995466 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:42 crc kubenswrapper[4867]: I1006 13:04:42.995478 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:42Z","lastTransitionTime":"2025-10-06T13:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.100078 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.100119 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.100129 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.100147 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.100161 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:43Z","lastTransitionTime":"2025-10-06T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.203615 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.203665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.203677 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.203695 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.203709 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:43Z","lastTransitionTime":"2025-10-06T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.221177 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:43 crc kubenswrapper[4867]: E1006 13:04:43.221290 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.221449 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:43 crc kubenswrapper[4867]: E1006 13:04:43.221497 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.221619 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.221691 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:43 crc kubenswrapper[4867]: E1006 13:04:43.221879 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:43 crc kubenswrapper[4867]: E1006 13:04:43.221988 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.306740 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.306803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.306817 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.306835 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.306847 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:43Z","lastTransitionTime":"2025-10-06T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.410362 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.410431 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.410450 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.410481 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.410517 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:43Z","lastTransitionTime":"2025-10-06T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.514622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.514709 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.514730 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.514768 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.514795 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:43Z","lastTransitionTime":"2025-10-06T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.617885 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.617968 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.617990 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.618022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.618042 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:43Z","lastTransitionTime":"2025-10-06T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.720690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.720729 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.720737 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.720750 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.720759 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:43Z","lastTransitionTime":"2025-10-06T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.823945 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.823998 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.824015 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.824037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.824051 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:43Z","lastTransitionTime":"2025-10-06T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.926972 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.927231 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.927249 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.927304 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:43 crc kubenswrapper[4867]: I1006 13:04:43.927328 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:43Z","lastTransitionTime":"2025-10-06T13:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.031784 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.031896 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.031909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.031928 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.031945 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:44Z","lastTransitionTime":"2025-10-06T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.135336 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.135403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.135416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.135433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.135446 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:44Z","lastTransitionTime":"2025-10-06T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.238714 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.238775 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.238795 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.238817 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.238834 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:44Z","lastTransitionTime":"2025-10-06T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.340780 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.340823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.340832 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.340847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.340857 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:44Z","lastTransitionTime":"2025-10-06T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.444331 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.444392 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.444406 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.444427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.444440 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:44Z","lastTransitionTime":"2025-10-06T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.547794 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.547876 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.547890 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.547919 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.547933 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:44Z","lastTransitionTime":"2025-10-06T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.650763 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.650823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.650839 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.650860 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.650876 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:44Z","lastTransitionTime":"2025-10-06T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.753871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.753928 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.753939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.753956 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.753968 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:44Z","lastTransitionTime":"2025-10-06T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.857215 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.857504 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.857577 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.857655 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.857728 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:44Z","lastTransitionTime":"2025-10-06T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.961123 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.961185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.961205 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.961233 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:44 crc kubenswrapper[4867]: I1006 13:04:44.961286 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:44Z","lastTransitionTime":"2025-10-06T13:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.064979 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.065027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.065037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.065056 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.065070 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:45Z","lastTransitionTime":"2025-10-06T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.167966 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.168002 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.168012 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.168026 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.168035 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:45Z","lastTransitionTime":"2025-10-06T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.223525 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.223586 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.223533 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:45 crc kubenswrapper[4867]: E1006 13:04:45.223771 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:45 crc kubenswrapper[4867]: E1006 13:04:45.223911 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:45 crc kubenswrapper[4867]: E1006 13:04:45.223986 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.224120 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:45 crc kubenswrapper[4867]: E1006 13:04:45.224174 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.271292 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.271338 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.271350 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.271371 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.271390 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:45Z","lastTransitionTime":"2025-10-06T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.374316 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.374356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.374366 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.374387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.374399 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:45Z","lastTransitionTime":"2025-10-06T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.477746 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.477835 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.477856 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.478343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.478666 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:45Z","lastTransitionTime":"2025-10-06T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.581746 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.581804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.581817 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.581837 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.581850 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:45Z","lastTransitionTime":"2025-10-06T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.685870 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.685949 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.685970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.686008 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.686027 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:45Z","lastTransitionTime":"2025-10-06T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.789067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.789132 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.789151 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.789177 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.789195 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:45Z","lastTransitionTime":"2025-10-06T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.892514 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.892602 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.892624 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.892654 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.892677 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:45Z","lastTransitionTime":"2025-10-06T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.995826 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.995899 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.995912 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.995931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:45 crc kubenswrapper[4867]: I1006 13:04:45.995960 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:45Z","lastTransitionTime":"2025-10-06T13:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.098516 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.098593 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.098606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.098629 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.098644 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:46Z","lastTransitionTime":"2025-10-06T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.201154 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.201220 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.201230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.201247 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.201280 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:46Z","lastTransitionTime":"2025-10-06T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.303985 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.304029 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.304039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.304058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.304069 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:46Z","lastTransitionTime":"2025-10-06T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.406953 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.406998 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.407007 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.407021 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.407029 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:46Z","lastTransitionTime":"2025-10-06T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.513356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.513873 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.514015 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.514150 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.514310 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:46Z","lastTransitionTime":"2025-10-06T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.616676 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.616727 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.616741 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.616787 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.616801 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:46Z","lastTransitionTime":"2025-10-06T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.719943 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.720007 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.720018 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.720037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.720051 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:46Z","lastTransitionTime":"2025-10-06T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.821860 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.821903 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.821915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.821930 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.821941 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:46Z","lastTransitionTime":"2025-10-06T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.924288 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.924333 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.924346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.924361 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:46 crc kubenswrapper[4867]: I1006 13:04:46.924372 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:46Z","lastTransitionTime":"2025-10-06T13:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.027155 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.027456 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.027532 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.027614 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.027685 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.131120 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.131161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.131186 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.131215 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.131228 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.220635 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.220856 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.220642 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.220871 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.221003 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.221016 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.221585 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.221700 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.222465 4867 scope.go:117] "RemoveContainer" containerID="b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58" Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.222754 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.234781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.234842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.234862 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.234889 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.234914 4867 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.321323 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.321592 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.321770 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs podName:b78c9415-85bd-40db-b44f-f1e04797a66e nodeName:}" failed. No retries permitted until 2025-10-06 13:05:19.321721658 +0000 UTC m=+98.779669992 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs") pod "network-metrics-daemon-8t2sq" (UID: "b78c9415-85bd-40db-b44f-f1e04797a66e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.338664 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.338713 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.338726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.338745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.338757 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.441586 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.441851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.441970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.442052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.442122 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.545065 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.545109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.545120 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.545138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.545150 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.647721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.647765 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.647775 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.647795 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.647806 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.750148 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.750290 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.750318 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.750353 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.750386 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.853519 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.853569 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.853581 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.853596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.853606 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.875930 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.876016 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.876040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.876076 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.876110 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.895076 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:47Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.898786 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.898815 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.898824 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.898838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.898866 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.911437 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:47Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.916999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.917103 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.917131 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.917166 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.917190 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.932330 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:47Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.938075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.938109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.938121 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.938138 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.938151 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.958609 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:47Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.962785 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.962826 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.962838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.962855 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.962866 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.981078 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:47Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:47 crc kubenswrapper[4867]: E1006 13:04:47.981303 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.983588 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.983633 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.983646 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.983670 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:47 crc kubenswrapper[4867]: I1006 13:04:47.983684 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:47Z","lastTransitionTime":"2025-10-06T13:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.086956 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.087017 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.087031 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.087050 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.087066 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:48Z","lastTransitionTime":"2025-10-06T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.190819 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.190880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.190891 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.190905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.190914 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:48Z","lastTransitionTime":"2025-10-06T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.293607 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.293648 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.293660 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.293675 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.293686 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:48Z","lastTransitionTime":"2025-10-06T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.396928 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.396972 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.397009 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.397024 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.397034 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:48Z","lastTransitionTime":"2025-10-06T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.499295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.499359 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.499377 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.499405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.499424 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:48Z","lastTransitionTime":"2025-10-06T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.603107 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.603181 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.603193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.603209 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.603230 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:48Z","lastTransitionTime":"2025-10-06T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.706484 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.706562 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.706582 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.706618 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.706645 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:48Z","lastTransitionTime":"2025-10-06T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.808908 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.808948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.808957 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.808970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.808981 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:48Z","lastTransitionTime":"2025-10-06T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.912198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.912270 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.912287 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.912305 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:48 crc kubenswrapper[4867]: I1006 13:04:48.912319 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:48Z","lastTransitionTime":"2025-10-06T13:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.014733 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.014771 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.014781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.014796 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.014809 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:49Z","lastTransitionTime":"2025-10-06T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.116702 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.116745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.116756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.116769 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.116779 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:49Z","lastTransitionTime":"2025-10-06T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.219867 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.219946 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.219967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.219997 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.220022 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:49Z","lastTransitionTime":"2025-10-06T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.220448 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.220501 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.220527 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:49 crc kubenswrapper[4867]: E1006 13:04:49.220619 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.220735 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:49 crc kubenswrapper[4867]: E1006 13:04:49.220863 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:49 crc kubenswrapper[4867]: E1006 13:04:49.221056 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:49 crc kubenswrapper[4867]: E1006 13:04:49.221211 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.322662 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.322723 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.322737 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.322761 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.322777 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:49Z","lastTransitionTime":"2025-10-06T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.425341 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.425644 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.425657 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.425710 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.425726 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:49Z","lastTransitionTime":"2025-10-06T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.528205 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.528241 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.528271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.528286 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.528297 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:49Z","lastTransitionTime":"2025-10-06T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.624406 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-knnfm_8e3bebeb-f8c1-4b1e-a320-b937eced1c3a/kube-multus/0.log" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.624459 4867 generic.go:334] "Generic (PLEG): container finished" podID="8e3bebeb-f8c1-4b1e-a320-b937eced1c3a" containerID="2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771" exitCode=1 Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.624488 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-knnfm" event={"ID":"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a","Type":"ContainerDied","Data":"2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771"} Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.624809 4867 scope.go:117] "RemoveContainer" containerID="2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.634330 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.634398 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.634417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.634439 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.634456 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:49Z","lastTransitionTime":"2025-10-06T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.640016 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.659950 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:49Z\\\",\\\"message\\\":\\\"2025-10-06T13:04:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884\\\\n2025-10-06T13:04:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884 to /host/opt/cni/bin/\\\\n2025-10-06T13:04:03Z [verbose] multus-daemon started\\\\n2025-10-06T13:04:03Z [verbose] Readiness Indicator file check\\\\n2025-10-06T13:04:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.672664 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/r
un/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.692698 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:32Z\\\",\\\"message\\\":\\\"mon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 13:04:32.221806 6511 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:04:32.221820 6511 tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.707340 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.728164 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.741533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.741681 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.741737 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.741794 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.741821 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:49Z","lastTransitionTime":"2025-10-06T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.746838 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.772404 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371d
bbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.791191 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.809679 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.825436 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.839496 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.844546 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.844599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.844612 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:49 crc 
kubenswrapper[4867]: I1006 13:04:49.844630 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.844642 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:49Z","lastTransitionTime":"2025-10-06T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.853608 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.871879 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\"
,\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 
13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.890621 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.905408 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea14188e-cc78-44b7-836e-1ecded809d4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57599ded487b78c1787aa529a0d4d2d6b684c445eb626d9efe0ddcdab321b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df801e51e6fb3509c31d3ed26436fef886ed99e5b5ec1589d83646466d76e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://166367864dd2a7f9fc01cb33d0bc921e345533200879fd32f85edcf85d1763c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.921436 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.938392 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:49Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.947446 4867 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.947486 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.947503 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.947521 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:49 crc kubenswrapper[4867]: I1006 13:04:49.947536 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:49Z","lastTransitionTime":"2025-10-06T13:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.050276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.050329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.050340 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.050358 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.050371 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:50Z","lastTransitionTime":"2025-10-06T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.152229 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.152312 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.152326 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.152345 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.152357 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:50Z","lastTransitionTime":"2025-10-06T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.254822 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.254879 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.254890 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.254905 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.254918 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:50Z","lastTransitionTime":"2025-10-06T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.358000 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.358039 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.358051 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.358064 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.358075 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:50Z","lastTransitionTime":"2025-10-06T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.460128 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.460166 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.460174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.460189 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.460198 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:50Z","lastTransitionTime":"2025-10-06T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.562781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.562851 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.562860 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.562876 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.562885 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:50Z","lastTransitionTime":"2025-10-06T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.628853 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-knnfm_8e3bebeb-f8c1-4b1e-a320-b937eced1c3a/kube-multus/0.log" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.628908 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-knnfm" event={"ID":"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a","Type":"ContainerStarted","Data":"5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4"} Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.650124 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:32Z\\\",\\\"message\\\":\\\"mon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 13:04:32.221806 6511 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:04:32.221820 6511 tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.661701 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.665495 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.665555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.665574 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.665596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.665613 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:50Z","lastTransitionTime":"2025-10-06T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.676607 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.688700 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.699376 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.711190 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.721757 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.737412 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.746433 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.767709 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.767770 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.767785 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.767833 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.767850 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:50Z","lastTransitionTime":"2025-10-06T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.768728 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.783046 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.796178 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea14188e-cc78-44b7-836e-1ecded809d4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57599ded487b78c1787aa529a0d4d2d6b684c445eb626d9efe0ddcdab321b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df801e51e6fb3509c31d3ed26436fef886ed99e5b5ec1589d83646466d76e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://166367864dd2a7f9fc01cb33d0bc921e345533200879fd32f85edcf85d1763c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.808621 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.819879 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.834087 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d08940
5025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.848430 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.859664 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.870527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.870565 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.870574 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:50 crc 
kubenswrapper[4867]: I1006 13:04:50.870587 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.870702 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:49Z\\\",\\\"message\\\":\\\"2025-10-06T13:04:03+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884\\\\n2025-10-06T13:04:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884 to /host/opt/cni/bin/\\\\n2025-10-06T13:04:03Z [verbose] multus-daemon started\\\\n2025-10-06T13:04:03Z [verbose] Readiness Indicator file check\\\\n2025-10-06T13:04:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:50Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.870596 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:50Z","lastTransitionTime":"2025-10-06T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.972276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.972303 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.972310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.972323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:50 crc kubenswrapper[4867]: I1006 13:04:50.972332 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:50Z","lastTransitionTime":"2025-10-06T13:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.073875 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.073915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.073924 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.073940 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.073949 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:51Z","lastTransitionTime":"2025-10-06T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.176124 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.176155 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.176164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.176178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.176187 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:51Z","lastTransitionTime":"2025-10-06T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.220640 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.220701 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:51 crc kubenswrapper[4867]: E1006 13:04:51.220771 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.220640 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.220889 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:51 crc kubenswrapper[4867]: E1006 13:04:51.220944 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:51 crc kubenswrapper[4867]: E1006 13:04:51.221009 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:51 crc kubenswrapper[4867]: E1006 13:04:51.221042 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.232561 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.243093 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.251967 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.260446 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.272693 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.277859 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.277903 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.277915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.277933 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.277944 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:51Z","lastTransitionTime":"2025-10-06T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.282281 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.298664 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.308835 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.319641 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea14188e-cc78-44b7-836e-1ecded809d4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57599ded487b78c1787aa529a0d4d2d6b684c445eb626d9efe0ddcdab321b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df801e51e6fb3509c31d3ed26436fef886ed99e5b5ec1589d83646466d76e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://166367864dd2a7f9fc01cb33d0bc921e345533200879fd32f85edcf85d1763c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.332614 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.346757 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.358457 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d08940
5025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.368657 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.387548 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.387841 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.387945 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.388062 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.388154 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:51Z","lastTransitionTime":"2025-10-06T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.409104 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.427361 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:49Z\\\",\\\"message\\\":\\\"2025-10-06T13:04:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884\\\\n2025-10-06T13:04:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884 to /host/opt/cni/bin/\\\\n2025-10-06T13:04:03Z [verbose] multus-daemon started\\\\n2025-10-06T13:04:03Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T13:04:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.448807 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:32Z\\\",\\\"message\\\":\\\"mon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 13:04:32.221806 6511 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:04:32.221820 6511 tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.461545 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.474289 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:51Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.490414 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.490578 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 
13:04:51.490677 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.490767 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.490856 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:51Z","lastTransitionTime":"2025-10-06T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.594460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.594642 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.594672 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.594781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.594837 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:51Z","lastTransitionTime":"2025-10-06T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.696887 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.696941 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.696953 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.696970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.696981 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:51Z","lastTransitionTime":"2025-10-06T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.799010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.799050 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.799058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.799071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.799081 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:51Z","lastTransitionTime":"2025-10-06T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.901057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.901103 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.901113 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.901130 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:51 crc kubenswrapper[4867]: I1006 13:04:51.901142 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:51Z","lastTransitionTime":"2025-10-06T13:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.003483 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.003529 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.003538 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.003561 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.003569 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:52Z","lastTransitionTime":"2025-10-06T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.105878 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.105909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.105918 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.105932 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.105942 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:52Z","lastTransitionTime":"2025-10-06T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.208940 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.208988 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.208999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.209016 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.209028 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:52Z","lastTransitionTime":"2025-10-06T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.312359 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.312406 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.312416 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.312433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.312445 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:52Z","lastTransitionTime":"2025-10-06T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.414606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.414642 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.414655 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.414672 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.414685 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:52Z","lastTransitionTime":"2025-10-06T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.516647 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.516689 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.516699 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.516713 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.516723 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:52Z","lastTransitionTime":"2025-10-06T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.618474 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.618516 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.618528 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.618542 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.618555 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:52Z","lastTransitionTime":"2025-10-06T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.720247 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.720308 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.720316 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.720330 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.720339 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:52Z","lastTransitionTime":"2025-10-06T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.822435 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.822474 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.822484 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.822501 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.822512 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:52Z","lastTransitionTime":"2025-10-06T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.924637 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.924691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.924702 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.924716 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:52 crc kubenswrapper[4867]: I1006 13:04:52.924724 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:52Z","lastTransitionTime":"2025-10-06T13:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.026688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.026729 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.026740 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.026754 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.026763 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:53Z","lastTransitionTime":"2025-10-06T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.129552 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.129601 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.129616 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.129635 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.129646 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:53Z","lastTransitionTime":"2025-10-06T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.221050 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.221110 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.221075 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.221050 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:53 crc kubenswrapper[4867]: E1006 13:04:53.221193 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:53 crc kubenswrapper[4867]: E1006 13:04:53.221323 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:53 crc kubenswrapper[4867]: E1006 13:04:53.221397 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:53 crc kubenswrapper[4867]: E1006 13:04:53.221510 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.231866 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.231934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.231950 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.231967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.231978 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:53Z","lastTransitionTime":"2025-10-06T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.334369 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.334428 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.334440 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.334456 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.334466 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:53Z","lastTransitionTime":"2025-10-06T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.436606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.436644 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.436654 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.436668 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.436679 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:53Z","lastTransitionTime":"2025-10-06T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.538878 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.538917 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.538928 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.538942 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.538954 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:53Z","lastTransitionTime":"2025-10-06T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.641237 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.641290 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.641301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.641320 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.641331 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:53Z","lastTransitionTime":"2025-10-06T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.743443 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.743480 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.743490 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.743506 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.743516 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:53Z","lastTransitionTime":"2025-10-06T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.846387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.847052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.847167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.847309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.847428 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:53Z","lastTransitionTime":"2025-10-06T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.950091 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.950133 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.950143 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.950157 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:53 crc kubenswrapper[4867]: I1006 13:04:53.950166 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:53Z","lastTransitionTime":"2025-10-06T13:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.052885 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.052925 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.052934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.052948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.052957 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:54Z","lastTransitionTime":"2025-10-06T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.155018 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.155047 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.155055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.155067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.155075 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:54Z","lastTransitionTime":"2025-10-06T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.257109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.257168 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.257177 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.257189 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.257198 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:54Z","lastTransitionTime":"2025-10-06T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.359544 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.359586 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.359594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.359609 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.359618 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:54Z","lastTransitionTime":"2025-10-06T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.461562 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.461608 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.461619 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.461633 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.461645 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:54Z","lastTransitionTime":"2025-10-06T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.563883 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.563927 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.563936 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.563949 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.563960 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:54Z","lastTransitionTime":"2025-10-06T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.667308 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.667349 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.667357 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.667372 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.667381 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:54Z","lastTransitionTime":"2025-10-06T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.769134 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.769177 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.769186 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.769201 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.769211 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:54Z","lastTransitionTime":"2025-10-06T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.871669 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.871727 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.871735 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.871749 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.871758 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:54Z","lastTransitionTime":"2025-10-06T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.974016 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.974058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.974066 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.974081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:54 crc kubenswrapper[4867]: I1006 13:04:54.974095 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:54Z","lastTransitionTime":"2025-10-06T13:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.076727 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.076763 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.076771 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.076786 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.076798 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:55Z","lastTransitionTime":"2025-10-06T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.179037 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.179078 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.179088 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.179102 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.179114 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:55Z","lastTransitionTime":"2025-10-06T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.221006 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.221082 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.221029 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:55 crc kubenswrapper[4867]: E1006 13:04:55.221142 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:55 crc kubenswrapper[4867]: E1006 13:04:55.221203 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.221048 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:55 crc kubenswrapper[4867]: E1006 13:04:55.221303 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:55 crc kubenswrapper[4867]: E1006 13:04:55.221428 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.281793 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.281835 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.281845 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.281858 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.281868 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:55Z","lastTransitionTime":"2025-10-06T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.384343 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.384383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.384392 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.384405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.384416 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:55Z","lastTransitionTime":"2025-10-06T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.486776 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.486820 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.486832 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.486846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.486858 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:55Z","lastTransitionTime":"2025-10-06T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.589358 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.589395 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.589403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.589417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.589427 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:55Z","lastTransitionTime":"2025-10-06T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.691156 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.691207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.691218 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.691232 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.691242 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:55Z","lastTransitionTime":"2025-10-06T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.793669 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.793719 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.793735 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.793756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.793769 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:55Z","lastTransitionTime":"2025-10-06T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.896365 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.896408 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.896417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.896432 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.896443 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:55Z","lastTransitionTime":"2025-10-06T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.998640 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.998688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.998700 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.998718 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:55 crc kubenswrapper[4867]: I1006 13:04:55.998730 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:55Z","lastTransitionTime":"2025-10-06T13:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.100934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.100977 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.100989 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.101006 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.101017 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:56Z","lastTransitionTime":"2025-10-06T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.203387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.203442 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.203455 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.203472 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.203487 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:56Z","lastTransitionTime":"2025-10-06T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.232346 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.305801 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.305838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.305847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.305862 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.305872 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:56Z","lastTransitionTime":"2025-10-06T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.409005 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.409046 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.409056 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.409090 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.409099 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:56Z","lastTransitionTime":"2025-10-06T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.511328 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.511377 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.511389 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.511409 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.511421 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:56Z","lastTransitionTime":"2025-10-06T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.614029 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.614077 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.614090 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.614107 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.614119 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:56Z","lastTransitionTime":"2025-10-06T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.715930 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.715971 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.715983 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.715999 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.716010 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:56Z","lastTransitionTime":"2025-10-06T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.818373 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.818449 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.818460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.818473 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.818482 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:56Z","lastTransitionTime":"2025-10-06T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.920981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.921052 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.921075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.921108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:56 crc kubenswrapper[4867]: I1006 13:04:56.921132 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:56Z","lastTransitionTime":"2025-10-06T13:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.023061 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.023113 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.023124 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.023141 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.023153 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:57Z","lastTransitionTime":"2025-10-06T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.126218 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.126275 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.126285 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.126299 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.126308 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:57Z","lastTransitionTime":"2025-10-06T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.221537 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.221623 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.221545 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.221563 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 13:04:57 crc kubenswrapper[4867]: E1006 13:04:57.221729 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 13:04:57 crc kubenswrapper[4867]: E1006 13:04:57.221830 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e"
Oct 06 13:04:57 crc kubenswrapper[4867]: E1006 13:04:57.221908 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 13:04:57 crc kubenswrapper[4867]: E1006 13:04:57.221952 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.228273 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.228300 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.228308 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.228322 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.228330 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:57Z","lastTransitionTime":"2025-10-06T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.330818 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.330861 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.330872 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.330889 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.330901 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:57Z","lastTransitionTime":"2025-10-06T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.433776 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.433813 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.433823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.433837 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.433847 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:57Z","lastTransitionTime":"2025-10-06T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.535706 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.535745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.535753 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.535768 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.535785 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:57Z","lastTransitionTime":"2025-10-06T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.638861 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.638909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.638922 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.638939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.638951 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:57Z","lastTransitionTime":"2025-10-06T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.741628 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.741670 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.741682 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.741699 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.741711 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:57Z","lastTransitionTime":"2025-10-06T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.843774 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.843817 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.843828 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.843842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.843851 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:57Z","lastTransitionTime":"2025-10-06T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.945570 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.945605 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.945614 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.945627 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:57 crc kubenswrapper[4867]: I1006 13:04:57.945637 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:57Z","lastTransitionTime":"2025-10-06T13:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.048607 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.048653 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.048662 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.048676 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.048689 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.105773 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.105822 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.105837 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.105856 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.105869 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: E1006 13:04:58.118119 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:58Z is after 2025-08-24T17:21:41Z"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.122591 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.122636 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.122649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.122666 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.122677 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: E1006 13:04:58.135895 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:58Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.141477 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.141549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.141611 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.141631 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.141646 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: E1006 13:04:58.155353 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:58Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.158361 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.158425 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.158442 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.158467 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.158484 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: E1006 13:04:58.172939 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:58Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.176656 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.176707 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.176726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.176747 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.176764 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: E1006 13:04:58.189039 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:04:58Z is after 2025-08-24T17:21:41Z" Oct 06 13:04:58 crc kubenswrapper[4867]: E1006 13:04:58.189155 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.190963 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.190991 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.191005 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.191021 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.191033 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.293346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.293387 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.293395 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.293409 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.293421 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.395866 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.395909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.395918 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.395931 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.395939 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.498569 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.498638 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.498661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.498689 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.498709 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.601324 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.601391 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.601402 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.601419 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.601433 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.703744 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.703816 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.703838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.703868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.703889 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.807665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.807713 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.807724 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.807742 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.807753 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.910414 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.910465 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.910478 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.910497 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:58 crc kubenswrapper[4867]: I1006 13:04:58.910509 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:58Z","lastTransitionTime":"2025-10-06T13:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.014172 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.014242 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.014305 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.014329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.014347 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:59Z","lastTransitionTime":"2025-10-06T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.117022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.117054 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.117065 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.117080 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.117091 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:59Z","lastTransitionTime":"2025-10-06T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.219616 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.219871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.219883 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.219895 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.219905 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:59Z","lastTransitionTime":"2025-10-06T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.220187 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.220200 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:04:59 crc kubenswrapper[4867]: E1006 13:04:59.220306 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.220360 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.220354 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:04:59 crc kubenswrapper[4867]: E1006 13:04:59.220649 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:04:59 crc kubenswrapper[4867]: E1006 13:04:59.220723 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:04:59 crc kubenswrapper[4867]: E1006 13:04:59.220570 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.323040 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.323078 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.323090 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.323106 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.323116 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:59Z","lastTransitionTime":"2025-10-06T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.425627 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.425671 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.425683 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.425697 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.425708 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:59Z","lastTransitionTime":"2025-10-06T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.527974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.528046 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.528064 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.528531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.528592 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:59Z","lastTransitionTime":"2025-10-06T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.631665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.631711 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.631722 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.631740 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.631752 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:59Z","lastTransitionTime":"2025-10-06T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.734657 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.734695 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.734704 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.734717 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.734726 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:59Z","lastTransitionTime":"2025-10-06T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.837731 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.837783 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.837790 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.837805 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.837814 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:59Z","lastTransitionTime":"2025-10-06T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.940227 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.940352 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.940374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.940407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:04:59 crc kubenswrapper[4867]: I1006 13:04:59.940429 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:04:59Z","lastTransitionTime":"2025-10-06T13:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.043099 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.043199 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.043213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.043230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.043242 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:00Z","lastTransitionTime":"2025-10-06T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.147204 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.147309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.147328 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.147355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.147373 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:00Z","lastTransitionTime":"2025-10-06T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.250769 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.250812 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.250820 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.250833 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.250844 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:00Z","lastTransitionTime":"2025-10-06T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.353871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.353943 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.353954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.353967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.353977 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:00Z","lastTransitionTime":"2025-10-06T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.457648 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.457681 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.457689 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.457702 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.457711 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:00Z","lastTransitionTime":"2025-10-06T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.560819 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.560859 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.560871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.560889 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.560900 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:00Z","lastTransitionTime":"2025-10-06T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.663799 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.663846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.663857 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.663873 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.663884 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:00Z","lastTransitionTime":"2025-10-06T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.767131 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.767191 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.767202 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.767221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.767232 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:00Z","lastTransitionTime":"2025-10-06T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.869880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.869927 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.869937 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.869954 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.869965 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:00Z","lastTransitionTime":"2025-10-06T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.972964 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.972997 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.973005 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.973020 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:00 crc kubenswrapper[4867]: I1006 13:05:00.973030 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:00Z","lastTransitionTime":"2025-10-06T13:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.075300 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.075329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.075338 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.075352 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.075361 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:01Z","lastTransitionTime":"2025-10-06T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.177341 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.177374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.177382 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.177395 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.177404 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:01Z","lastTransitionTime":"2025-10-06T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.220515 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.220522 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.220559 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.220887 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:01 crc kubenswrapper[4867]: E1006 13:05:01.221080 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:01 crc kubenswrapper[4867]: E1006 13:05:01.221194 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:01 crc kubenswrapper[4867]: E1006 13:05:01.221138 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:01 crc kubenswrapper[4867]: E1006 13:05:01.221288 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.231909 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7198ddc9-af6d-43a8-bf5b-14c096c4b05f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195d38cd77f948a851f2f1d0343b56091b81045e48249f91f7c2ee086f4aa430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.246594 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.266109 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:32Z\\\",\\\"message\\\":\\\"mon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 13:04:32.221806 6511 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:04:32.221820 6511 tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.277316 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.279509 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.279591 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.279620 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.279695 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.279721 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:01Z","lastTransitionTime":"2025-10-06T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.288032 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.303412 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.313443 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.330587 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.342827 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.354779 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.367204 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.377315 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.381944 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.381992 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.382006 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:01 crc 
kubenswrapper[4867]: I1006 13:05:01.382027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.382042 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:01Z","lastTransitionTime":"2025-10-06T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.388159 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC 
(now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.398801 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.414761 4867 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea14188e-cc78-44b7-836e-1ecded809d4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57599ded487b78c1787aa529a0d4d2d6b684c445eb626d9efe0ddcdab321b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df801e51e6fb3509c31d3ed26436fef886ed99e5b5ec1589d83646466d76e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://166367864dd2a7f9fc01cb33d0bc921e345533200879fd32f85edcf85d1763c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.426627 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.435759 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.446666 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.459810 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:49Z\\\",\\\"message\\\":\\\"2025-10-06T13:04:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884\\\\n2025-10-06T13:04:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884 to /host/opt/cni/bin/\\\\n2025-10-06T13:04:03Z [verbose] multus-daemon started\\\\n2025-10-06T13:04:03Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T13:04:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:01Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.484489 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.484524 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.484533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.484546 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.484555 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:01Z","lastTransitionTime":"2025-10-06T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.586170 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.586438 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.586534 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.586624 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.586702 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:01Z","lastTransitionTime":"2025-10-06T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.689736 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.689791 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.689804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.689821 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.689836 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:01Z","lastTransitionTime":"2025-10-06T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.792201 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.792460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.792573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.792674 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.792768 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:01Z","lastTransitionTime":"2025-10-06T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.895280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.895325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.895337 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.895361 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.895376 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:01Z","lastTransitionTime":"2025-10-06T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.998038 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.998077 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.998087 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.998102 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:01 crc kubenswrapper[4867]: I1006 13:05:01.998112 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:01Z","lastTransitionTime":"2025-10-06T13:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.100686 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.101186 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.101271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.101363 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.101446 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:02Z","lastTransitionTime":"2025-10-06T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.203946 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.203981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.203989 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.204007 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.204015 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:02Z","lastTransitionTime":"2025-10-06T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.221524 4867 scope.go:117] "RemoveContainer" containerID="b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.306111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.306168 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.306186 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.306209 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.306225 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:02Z","lastTransitionTime":"2025-10-06T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.408114 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.408531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.408912 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.409134 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.409395 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:02Z","lastTransitionTime":"2025-10-06T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.511802 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.511826 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.511836 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.511849 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.511857 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:02Z","lastTransitionTime":"2025-10-06T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.613652 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.613691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.613699 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.613714 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.613725 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:02Z","lastTransitionTime":"2025-10-06T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.660686 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/2.log" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.663599 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c"} Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.664635 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.685596 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.699131 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:49Z\\\",\\\"message\\\":\\\"2025-10-06T13:04:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884\\\\n2025-10-06T13:04:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884 to /host/opt/cni/bin/\\\\n2025-10-06T13:04:03Z [verbose] multus-daemon started\\\\n2025-10-06T13:04:03Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T13:04:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.709438 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7198ddc9-af6d-43a8-bf5b-14c096c4b05f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195d38cd77f948a851f2f1d0343b56091b81045e48249f91f7c2ee086f4aa430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.716475 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.716514 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.716527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.716543 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.716557 4867 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:02Z","lastTransitionTime":"2025-10-06T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.728152 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.746186 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:32Z\\\",\\\"message\\\":\\\"mon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 13:04:32.221806 6511 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:04:32.221820 6511 
tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.769354 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc 
kubenswrapper[4867]: I1006 13:05:02.780616 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.790434 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.804008 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.812906 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.819151 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.819185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.819195 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.819209 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.819221 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:02Z","lastTransitionTime":"2025-10-06T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.830283 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.841643 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.852549 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.862929 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.872961 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb
05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.892508 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\"
,\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 
13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.904308 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.921154 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.921199 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.921216 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.921230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.921118 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea14188e-cc78-44b7-836e-1ecded809d4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57599ded487b78c1787aa529a0d4d2d6b684c445eb626d9efe0ddcdab321b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df801e51e6fb3509c31d3ed26436fef886ed99e5b5ec1589d83646466d76e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://166367864dd2a7f9fc01cb33d0bc921e345533200879fd32f85edcf85d1763c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.921240 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:02Z","lastTransitionTime":"2025-10-06T13:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:02 crc kubenswrapper[4867]: I1006 13:05:02.937023 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:02Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.023240 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.023301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.023313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.023329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.023341 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:03Z","lastTransitionTime":"2025-10-06T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.125597 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.125639 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.125654 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.125672 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.125683 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:03Z","lastTransitionTime":"2025-10-06T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.220744 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.220787 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.220838 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.220862 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:03 crc kubenswrapper[4867]: E1006 13:05:03.221166 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:03 crc kubenswrapper[4867]: E1006 13:05:03.221320 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:03 crc kubenswrapper[4867]: E1006 13:05:03.221395 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:03 crc kubenswrapper[4867]: E1006 13:05:03.221507 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.227348 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.227398 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.227407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.227418 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.227428 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:03Z","lastTransitionTime":"2025-10-06T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.329003 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.329034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.329042 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.329054 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.329064 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:03Z","lastTransitionTime":"2025-10-06T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.431544 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.431584 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.431594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.431609 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.431619 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:03Z","lastTransitionTime":"2025-10-06T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.534093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.534125 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.534134 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.534146 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.534155 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:03Z","lastTransitionTime":"2025-10-06T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.635994 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.636058 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.636074 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.636101 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.636120 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:03Z","lastTransitionTime":"2025-10-06T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.667409 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/3.log" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.668124 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/2.log" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.670645 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" exitCode=1 Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.670690 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c"} Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.670731 4867 scope.go:117] "RemoveContainer" containerID="b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.671939 4867 scope.go:117] "RemoveContainer" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" Oct 06 13:05:03 crc kubenswrapper[4867]: E1006 13:05:03.672222 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.686487 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.697856 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:49Z\\\",\\\"message\\\":\\\"2025-10-06T13:04:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884\\\\n2025-10-06T13:04:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884 to /host/opt/cni/bin/\\\\n2025-10-06T13:04:03Z [verbose] multus-daemon started\\\\n2025-10-06T13:04:03Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T13:04:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.717451 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7c2cc107a3adf8dd695ee7cea3cfd74a8f4fca68b4ff46b4b0c91d4e963bc58\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:32Z\\\",\\\"message\\\":\\\"mon]} name:Service_openshift-machine-config-operator/machine-config-daemon_TCP_cluster 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.43:8798: 10.217.4.43:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {a36f6289-d09f-43f8-8a8a-c9d2cc11eb0d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 13:04:32.221806 6511 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:04:32.221820 6511 tr\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:05:03Z\\\",\\\"message\\\":\\\"penshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, 
EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:05:03.123933 6934 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 13:05:03.123940 6934 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1006 13:05:03.123945 6934 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
calli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f
38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.727348 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.738084 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.738118 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.738129 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.738144 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.738156 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:03Z","lastTransitionTime":"2025-10-06T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.743587 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7198ddc9-af6d-43a8-bf5b-14c096c4b05f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195d38cd77f948a851f2f1d0343b56091b81045e48249f91f7c2ee086f4aa430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.755580 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.766435 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.777632 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.787708 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.796333 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.808768 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.817538 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.838461 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.840737 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.840806 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.840830 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.840853 4867 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.840870 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:03Z","lastTransitionTime":"2025-10-06T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.848513 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.859507 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea14188e-cc78-44b7-836e-1ecded809d4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57599ded487b78c1787aa529a0d4d2d6b684c445eb626d9efe0ddcdab321b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df801e51e6fb3509c31d3ed26436fef886ed99e5b5ec1589d83646466d76e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://166367864dd2a7f9fc01cb33d0bc921e345533200879fd32f85edcf85d1763c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.869702 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.880012 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.893448 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d08940
5025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.904795 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:03Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.943161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.943207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.943219 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.943235 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:03 crc kubenswrapper[4867]: I1006 13:05:03.943247 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:03Z","lastTransitionTime":"2025-10-06T13:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.045721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.047759 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.047791 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.047832 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.047854 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:04Z","lastTransitionTime":"2025-10-06T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.150442 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.150499 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.150508 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.150521 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.150580 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:04Z","lastTransitionTime":"2025-10-06T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.252333 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.252367 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.252376 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.252389 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.252399 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:04Z","lastTransitionTime":"2025-10-06T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.354815 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.354843 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.354852 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.354865 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.354873 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:04Z","lastTransitionTime":"2025-10-06T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.457225 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.457357 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.457375 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.457403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.457421 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:04Z","lastTransitionTime":"2025-10-06T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.559470 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.559534 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.559551 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.559578 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.559596 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:04Z","lastTransitionTime":"2025-10-06T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.662732 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.662787 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.662803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.662826 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.662844 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:04Z","lastTransitionTime":"2025-10-06T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.675469 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/3.log" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.680356 4867 scope.go:117] "RemoveContainer" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" Oct 06 13:05:04 crc kubenswrapper[4867]: E1006 13:05:04.680599 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.691723 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:49Z\\\",\\\"message\\\":\\\"2025-10-06T13:04:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884\\\\n2025-10-06T13:04:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884 to /host/opt/cni/bin/\\\\n2025-10-06T13:04:03Z [verbose] multus-daemon started\\\\n2025-10-06T13:04:03Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T13:04:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.704645 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.720067 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.748650 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:05:03Z\\\",\\\"message\\\":\\\"penshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:05:03.123933 6934 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 13:05:03.123940 6934 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1006 13:05:03.123945 6934 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:05:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.759484 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.765408 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.765447 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.765489 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.765505 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.765516 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:04Z","lastTransitionTime":"2025-10-06T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.772295 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7198ddc9-af6d-43a8-bf5b-14c096c4b05f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195d38cd77f948a851f2f1d0343b56091b81045e48249f91f7c2ee086f4aa430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.786467 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.799315 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.812883 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.824072 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.836797 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.850350 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8b
d125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.862773 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.868678 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.868723 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.868734 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.868749 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.868758 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:04Z","lastTransitionTime":"2025-10-06T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.883530 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.895025 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.905165 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea14188e-cc78-44b7-836e-1ecded809d4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57599ded487b78c1787aa529a0d4d2d6b684c445eb626d9efe0ddcdab321b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df801e51e6fb3509c31d3ed26436fef886ed99e5b5ec1589d83646466d76e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://166367864dd2a7f9fc01cb33d0bc921e345533200879fd32f85edcf85d1763c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.918585 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.928024 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.939063 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d08940
5025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:04Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.970611 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.970645 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.970653 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.970665 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:04 crc kubenswrapper[4867]: I1006 13:05:04.970672 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:04Z","lastTransitionTime":"2025-10-06T13:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.010219 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.010359 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010388 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.010362301 +0000 UTC m=+148.468310445 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.010435 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010485 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.010502 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.010543 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 
13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010504 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010624 4867 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010640 4867 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010654 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010684 4867 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010693 4867 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010556 4867 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:05:05 crc kubenswrapper[4867]: 
E1006 13:05:05.010669 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.010658649 +0000 UTC m=+148.468606803 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010752 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.010732771 +0000 UTC m=+148.468680985 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010784 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.010775352 +0000 UTC m=+148.468723646 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.010800 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.010791772 +0000 UTC m=+148.468740036 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.072996 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.073026 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.073035 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.073049 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.073058 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:05Z","lastTransitionTime":"2025-10-06T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.175215 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.175284 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.175298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.175317 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.175329 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:05Z","lastTransitionTime":"2025-10-06T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.220987 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.221019 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.221050 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.221141 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.221163 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.221273 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.221342 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:05 crc kubenswrapper[4867]: E1006 13:05:05.221491 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.277669 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.277703 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.277715 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.277748 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.277759 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:05Z","lastTransitionTime":"2025-10-06T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.380380 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.380476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.380490 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.380510 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.380524 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:05Z","lastTransitionTime":"2025-10-06T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.483493 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.483533 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.483541 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.483557 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.483569 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:05Z","lastTransitionTime":"2025-10-06T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.587182 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.587235 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.587266 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.587283 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.587294 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:05Z","lastTransitionTime":"2025-10-06T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.689441 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.689506 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.689525 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.689548 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.689565 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:05Z","lastTransitionTime":"2025-10-06T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.791902 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.791967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.791982 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.792001 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.792013 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:05Z","lastTransitionTime":"2025-10-06T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.895135 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.895178 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.895189 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.895211 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.895225 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:05Z","lastTransitionTime":"2025-10-06T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.998056 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.998097 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.998129 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.998150 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:05 crc kubenswrapper[4867]: I1006 13:05:05.998163 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:05Z","lastTransitionTime":"2025-10-06T13:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.101184 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.101225 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.101238 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.101283 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.101297 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:06Z","lastTransitionTime":"2025-10-06T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.204498 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.204549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.204561 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.204580 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.204594 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:06Z","lastTransitionTime":"2025-10-06T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.306605 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.306739 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.306757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.306782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.306795 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:06Z","lastTransitionTime":"2025-10-06T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.409292 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.409380 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.409398 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.409424 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.409442 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:06Z","lastTransitionTime":"2025-10-06T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.511719 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.511756 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.511764 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.511778 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.511788 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:06Z","lastTransitionTime":"2025-10-06T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.614404 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.614467 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.614479 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.614496 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.614508 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:06Z","lastTransitionTime":"2025-10-06T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.716478 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.716519 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.716531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.716544 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.716553 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:06Z","lastTransitionTime":"2025-10-06T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.818776 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.818822 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.818838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.818854 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.818866 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:06Z","lastTransitionTime":"2025-10-06T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.921196 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.921242 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.921288 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.921306 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:06 crc kubenswrapper[4867]: I1006 13:05:06.921318 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:06Z","lastTransitionTime":"2025-10-06T13:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.024523 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.024565 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.024577 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.024594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.024605 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:07Z","lastTransitionTime":"2025-10-06T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.128498 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.128585 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.128598 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.128617 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.128635 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:07Z","lastTransitionTime":"2025-10-06T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.221337 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.221332 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:07 crc kubenswrapper[4867]: E1006 13:05:07.221565 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.221355 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:07 crc kubenswrapper[4867]: E1006 13:05:07.221481 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.221343 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:07 crc kubenswrapper[4867]: E1006 13:05:07.221672 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:07 crc kubenswrapper[4867]: E1006 13:05:07.221729 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.230720 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.230781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.230794 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.230808 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.230819 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:07Z","lastTransitionTime":"2025-10-06T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.333664 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.333728 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.333747 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.333771 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.333787 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:07Z","lastTransitionTime":"2025-10-06T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.436467 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.436506 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.436516 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.436532 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.436543 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:07Z","lastTransitionTime":"2025-10-06T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.539142 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.539171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.539185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.539215 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.539226 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:07Z","lastTransitionTime":"2025-10-06T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.641465 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.641510 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.641521 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.641536 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.641548 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:07Z","lastTransitionTime":"2025-10-06T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.744679 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.744738 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.744754 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.744777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.744796 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:07Z","lastTransitionTime":"2025-10-06T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.846777 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.846854 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.846867 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.846884 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.846922 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:07Z","lastTransitionTime":"2025-10-06T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.948562 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.948596 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.948604 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.948619 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:07 crc kubenswrapper[4867]: I1006 13:05:07.948632 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:07Z","lastTransitionTime":"2025-10-06T13:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.051108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.051153 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.051164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.051208 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.051220 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.153508 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.153546 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.153560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.153575 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.153586 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.256471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.256536 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.256549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.256565 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.256576 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.359584 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.359623 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.359631 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.359646 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.359694 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.461924 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.461967 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.461976 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.461990 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.461998 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.494485 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.494516 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.494525 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.494538 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.494545 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: E1006 13:05:08.506291 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.509597 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.509630 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.509641 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.509656 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.509667 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: E1006 13:05:08.521834 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.525450 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.525478 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.525488 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.525503 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.525516 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: E1006 13:05:08.536982 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.540198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.540262 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.540273 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.540288 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.540298 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: E1006 13:05:08.552990 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.557610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.557670 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.557688 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.557710 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.557727 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: E1006 13:05:08.575058 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:08Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:08 crc kubenswrapper[4867]: E1006 13:05:08.575329 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.577098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.577162 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.577187 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.577208 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.577225 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.680509 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.680577 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.680594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.680632 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.680655 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.782816 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.782871 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.782888 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.782907 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.782918 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.885234 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.885331 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.885348 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.885371 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.885388 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.988352 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.988429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.988451 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.988475 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:08 crc kubenswrapper[4867]: I1006 13:05:08.988493 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:08Z","lastTransitionTime":"2025-10-06T13:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.091197 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.091316 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.091342 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.091378 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.091400 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:09Z","lastTransitionTime":"2025-10-06T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.193838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.193883 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.193892 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.193907 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.193919 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:09Z","lastTransitionTime":"2025-10-06T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.220538 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.220556 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.220639 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.220688 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:09 crc kubenswrapper[4867]: E1006 13:05:09.220926 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:09 crc kubenswrapper[4867]: E1006 13:05:09.221002 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:09 crc kubenswrapper[4867]: E1006 13:05:09.221182 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:09 crc kubenswrapper[4867]: E1006 13:05:09.221319 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.296375 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.296450 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.296474 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.296496 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.296513 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:09Z","lastTransitionTime":"2025-10-06T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.398929 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.398992 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.399004 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.399021 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.399034 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:09Z","lastTransitionTime":"2025-10-06T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.501092 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.501172 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.501296 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.501332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.501350 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:09Z","lastTransitionTime":"2025-10-06T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.603326 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.603368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.603381 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.603395 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.603407 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:09Z","lastTransitionTime":"2025-10-06T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.706198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.706493 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.706561 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.706641 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.706725 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:09Z","lastTransitionTime":"2025-10-06T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.809383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.809414 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.809423 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.809437 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.809446 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:09Z","lastTransitionTime":"2025-10-06T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.912195 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.912236 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.912265 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.912283 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:09 crc kubenswrapper[4867]: I1006 13:05:09.912295 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:09Z","lastTransitionTime":"2025-10-06T13:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.014800 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.014847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.014858 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.014872 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.014883 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:10Z","lastTransitionTime":"2025-10-06T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.117757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.117793 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.117803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.117818 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.117828 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:10Z","lastTransitionTime":"2025-10-06T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.220018 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.220057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.220065 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.220077 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.220086 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:10Z","lastTransitionTime":"2025-10-06T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.322466 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.322531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.322541 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.322555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.322566 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:10Z","lastTransitionTime":"2025-10-06T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.425407 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.425482 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.425496 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.425511 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.425524 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:10Z","lastTransitionTime":"2025-10-06T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.528075 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.528109 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.528118 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.528132 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.528140 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:10Z","lastTransitionTime":"2025-10-06T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.630686 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.630722 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.630731 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.630745 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.630755 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:10Z","lastTransitionTime":"2025-10-06T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.732612 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.732667 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.732680 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.732707 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.732721 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:10Z","lastTransitionTime":"2025-10-06T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.835814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.835873 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.835918 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.835957 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.836169 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:10Z","lastTransitionTime":"2025-10-06T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.938822 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.938876 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.938892 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.938908 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:10 crc kubenswrapper[4867]: I1006 13:05:10.938921 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:10Z","lastTransitionTime":"2025-10-06T13:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.041133 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.041189 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.041199 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.041216 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.041227 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:11Z","lastTransitionTime":"2025-10-06T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.143695 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.143759 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.143768 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.143783 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.143794 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:11Z","lastTransitionTime":"2025-10-06T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.220683 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.220774 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:11 crc kubenswrapper[4867]: E1006 13:05:11.220857 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.220776 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:11 crc kubenswrapper[4867]: E1006 13:05:11.220922 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:11 crc kubenswrapper[4867]: E1006 13:05:11.221038 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.221462 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:11 crc kubenswrapper[4867]: E1006 13:05:11.221655 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.235475 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rssjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52bd1ba-10f1-40c3-a0e7-f6e051234752\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e89ec59f51bd31c759cb799e90088ca2d97256d5b89ae06bd1d1e418576c7d05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d3edbde731ae6ae3deb5a896b764bd659a382b01bdf26a3edd6018312fba12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75b73fceb46bbc2c99f6c6bc81a74cff5dcbfadd1a085ad2874638d9c909ab67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad499f3ef928b4570fb2eddbce5f1bf0a4f7d7410456a0cdd88a266bc48570b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os
-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05753a5d687fb3e5d813b64a5e480124072c22b9824f281777a02bb8e49c5e34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d814fce784f1168f62025c53fa4d5a8bd125c6dc6eddfdee9b33248379873cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab745879c73647ff838bf70ea4ea02b26a4691cc3c191e31878fd545fe00757b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhnn5\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rssjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.245923 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x2x4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fce7eafd-a44a-4e15-b02e-30800f29c4e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceed21087adc115681695f6608c5fb6e4b6bb5f3bdfedb5eb4c085cb8d7b52ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086
a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n27vn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x2x4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.246566 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.246718 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.246837 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.246935 4867 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.247054 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:11Z","lastTransitionTime":"2025-10-06T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.270446 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b252440-514a-4d96-8b89-0ac441119c01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d9085c2269dc0bf2ebada11976925c9d13ba30819d162b5ac76ef548dd478d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24a5404bb1d4702da993833694f2a56a4e146de3dd1bb2241e189b2c32b892a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a82df8a56c298f16a3cefd54f73ceba129103ea6d5a56c4d59aad4943b118fd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T1
3:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://550e402a95b1e0c3fbf017edbf4601a36f539b3dd030874983442c8fdb7fb8e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa621ff0dcd1d15b3050a96cd27dc5dc5bfd1ae842d219afe278b780435e9265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75569fb56822ed9f658014c6e37f83ad8224c3cc334886d04e825492979d400b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858ed2e69f02f0f73fc6f982df341518a03f1724856cb90d5ce5603c09ed93e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9371dbbd931d16c126dc0afa23ec6171057733008b547744ecc13f6ccba74723\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.283490 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.293853 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24f41f47f9dbd65b529075697a5770de9799421af1af5c06c3b9e93b3d3648b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.306785 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.318558 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5dc284-392f-4e65-9f43-cb9ced2e47d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d40c1c4508c5f31557ac8d4fa9881699c57a3e881518ebd49fce3a747385acb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb9065
4508a88bc8305f83f6631ab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-897ln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-shmxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.330813 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sdmmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3daf6dcd-ed6a-4a39-892d-2c65de264a48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407cbafe28c9b981229e673383675d4ffad082e229e896c93c9947112ac9872f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-scfz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sdmmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.349645 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.349697 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.349713 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.349734 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.349751 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:11Z","lastTransitionTime":"2025-10-06T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.354807 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62259c92-74c6-4913-9db3-47ece70581b8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0488998f41ec05df5a6552e54431ab560d40a6d65d0afc23979df15a07c03097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9013bdb3394ee3700ea5b91692983ff13140084f18e67e56c6cc5e0f0281ec63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f0460c302e02d39d059a252474a521546faa90405fe65376cfb1e790a38b464\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3546472df4d8acb2e1affe636b31348ca43e59405e4fa4f3dec08172012f5d1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b8d9d1ecb2b6f5564cb85d2d57d730246bf8d089405025b81cc65f4d65728e8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T13:04:00Z\\\",\\\"message\\\":\\\"entication::requestheader-client-ca-file\\\\nI1006 13:04:00.447532 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1006 13:04:00.447553 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1006 13:04:00.447942 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1006 13:04:00.448332 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\"\\\\nI1006 13:04:00.449395 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-2311859286/tls.crt::/tmp/serving-cert-2311859286/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1759755824\\\\\\\\\\\\\\\" (2025-10-06 13:03:43 +0000 UTC to 2025-11-05 13:03:44 +0000 UTC (now=2025-10-06 13:04:00.449356996 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449889 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1759755835\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1759755834\\\\\\\\\\\\\\\" (2025-10-06 12:03:54 +0000 UTC to 2026-10-06 12:03:54 +0000 UTC (now=2025-10-06 13:04:00.449857628 +0000 UTC))\\\\\\\"\\\\nI1006 13:04:00.449928 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\nI1006 13:04:00.449958 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1006 13:04:00.450029 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1006 13:04:00.450595 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1006 13:04:00.451955 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1006 13:04:00.451951 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b2423d2f957ecd9526af53e375310a3535d3cfb7777a8c11b74a9632254ba5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5176098ff5d8a644cb55bf4a57192289cc1eb4954058efefe4e17aa4b7e82254\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.368883 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b77f199c-7c90-4a9b-b2db-0585585c4288\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d85c306c45b4a13b69c03c130fbf0c38bf21f1c0011ddebb79c5d29a16d803d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f05f81541b1e2510dc6f3bf7504bc3703a4e436997857bea18cd1ce754badb42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc7cadb3105d611120ce3e26d05733f2918daed600c5b09ce0f319f0fdbf6666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0db600a2544226eee1fbedfbf1da20acfdf99905145ded6f9e961e7858dea81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.381037 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea14188e-cc78-44b7-836e-1ecded809d4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b57599ded487b78c1787aa529a0d4d2d6b684c445eb626d9efe0ddcdab321b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df801e51e6fb3509c31d3ed26436fef886ed99e5b5ec1589d83646466d76e49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://166367864dd2a7f9fc01cb33d0bc921e345533200879fd32f85edcf85d1763c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c00cf577052cb0c437d58bafe1da83f5b0a97735d207bac2369fab9b0128cdfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.393040 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb276e800dbef23ebf475ab7e9ac24118f5d870f3e0740efa85d4f0f244d8ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.403048 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c693b796-691d-4cc2-8d01-a0589e8833ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba1f8e5e4373522d645ef0e4cdd870ae38927315d139af8eef3f5f8b6e82257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76dc3bab3643c68ba3c0d7bd1e5e71f7dbdb05d46fae50e398040b9aa3f86ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwpkb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-f4ldd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.415549 4867 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.426453 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-knnfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:04:49Z\\\",\\\"message\\\":\\\"2025-10-06T13:04:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884\\\\n2025-10-06T13:04:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ac922cf7-fdf1-4424-b2ae-81f3f126a884 to /host/opt/cni/bin/\\\\n2025-10-06T13:04:03Z [verbose] multus-daemon started\\\\n2025-10-06T13:04:03Z [verbose] 
Readiness Indicator file check\\\\n2025-10-06T13:04:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fhm7v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-multus\"/\"multus-knnfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.436335 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7198ddc9-af6d-43a8-bf5b-14c096c4b05f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:03:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195d38cd77f948a851f2f1d0343b56091b81045e48249f91f7c2ee086f4aa430\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52bb29e16b6159a854911ba772bb70664326cf3b8151e9f157df0e5c09a3dac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:03:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:03:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:03:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.448545 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e962dea92773102bfc613c0727e33ceb4b077a79d391c42347e29df9fbc7c74c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e0f5a95f7d97c5f7c42413753434e854aa5a97b34e3c185723214275b6fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.452055 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.452085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.452094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.452105 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.452114 4867 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:11Z","lastTransitionTime":"2025-10-06T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.467278 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93569a52-4f36-4017-9834-b3651d6cd63e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T13:05:03Z\\\",\\\"message\\\":\\\"penshift-kube-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.219\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 13:05:03.123933 6934 services_controller.go:452] Built service openshift-kube-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1006 13:05:03.123940 6934 services_controller.go:453] Built service openshift-kube-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1006 13:05:03.123945 6934 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T13:05:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T13:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b2419476329cb6fea
5397480032e6529e9cb09d34d95f454e56c1d271371f38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T13:04:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T13:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlc7z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.477730 4867 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b78c9415-85bd-40db-b44f-f1e04797a66e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T13:04:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbqrf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T13:04:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8t2sq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:11Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.554796 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.554843 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.554853 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.554868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.554880 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:11Z","lastTransitionTime":"2025-10-06T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.656111 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.656144 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.656152 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.656164 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.656177 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:11Z","lastTransitionTime":"2025-10-06T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.758475 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.758513 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.758522 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.758538 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.758549 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:11Z","lastTransitionTime":"2025-10-06T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.860788 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.860828 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.860839 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.860854 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.860866 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:11Z","lastTransitionTime":"2025-10-06T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.963489 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.963549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.963558 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.963572 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:11 crc kubenswrapper[4867]: I1006 13:05:11.963582 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:11Z","lastTransitionTime":"2025-10-06T13:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.065941 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.065984 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.065992 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.066009 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.066018 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:12Z","lastTransitionTime":"2025-10-06T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.168301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.168346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.168359 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.168373 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.168383 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:12Z","lastTransitionTime":"2025-10-06T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.270563 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.270608 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.270617 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.270631 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.270640 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:12Z","lastTransitionTime":"2025-10-06T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.372160 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.372208 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.372224 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.372242 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.372282 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:12Z","lastTransitionTime":"2025-10-06T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.474529 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.474564 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.474572 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.474587 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.474596 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:12Z","lastTransitionTime":"2025-10-06T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.577426 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.577459 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.577492 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.577509 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.577520 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:12Z","lastTransitionTime":"2025-10-06T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.680022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.680056 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.680065 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.680077 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.680086 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:12Z","lastTransitionTime":"2025-10-06T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.782295 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.782332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.782342 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.782356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.782366 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:12Z","lastTransitionTime":"2025-10-06T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.885428 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.885471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.885480 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.885495 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.885508 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:12Z","lastTransitionTime":"2025-10-06T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.987787 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.987819 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.987828 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.987842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:12 crc kubenswrapper[4867]: I1006 13:05:12.987853 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:12Z","lastTransitionTime":"2025-10-06T13:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.089794 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.089844 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.089854 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.089869 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.089878 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:13Z","lastTransitionTime":"2025-10-06T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.192823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.192864 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.192894 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.192909 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.192918 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:13Z","lastTransitionTime":"2025-10-06T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.220371 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.220440 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:13 crc kubenswrapper[4867]: E1006 13:05:13.220480 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.220530 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:13 crc kubenswrapper[4867]: E1006 13:05:13.220642 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.220689 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:13 crc kubenswrapper[4867]: E1006 13:05:13.221106 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:13 crc kubenswrapper[4867]: E1006 13:05:13.221318 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.295020 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.295060 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.295071 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.295089 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.295100 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:13Z","lastTransitionTime":"2025-10-06T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.396939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.396990 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.397030 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.397048 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.397060 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:13Z","lastTransitionTime":"2025-10-06T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.499555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.499590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.499598 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.499630 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.499640 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:13Z","lastTransitionTime":"2025-10-06T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.602093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.602143 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.602154 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.602171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.602182 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:13Z","lastTransitionTime":"2025-10-06T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.704741 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.704805 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.704823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.704846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.704865 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:13Z","lastTransitionTime":"2025-10-06T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.807206 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.807302 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.807323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.807353 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.807378 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:13Z","lastTransitionTime":"2025-10-06T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.909171 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.909223 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.909240 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.909302 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:13 crc kubenswrapper[4867]: I1006 13:05:13.909319 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:13Z","lastTransitionTime":"2025-10-06T13:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.011271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.011300 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.011310 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.011322 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.011330 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:14Z","lastTransitionTime":"2025-10-06T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.114414 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.114471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.114483 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.114504 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.114518 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:14Z","lastTransitionTime":"2025-10-06T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.217276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.217352 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.217369 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.217396 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.217413 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:14Z","lastTransitionTime":"2025-10-06T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.319489 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.319527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.319538 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.319550 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.319560 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:14Z","lastTransitionTime":"2025-10-06T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.421865 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.421988 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.422005 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.422340 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.422355 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:14Z","lastTransitionTime":"2025-10-06T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.524517 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.524577 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.524589 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.524608 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.524623 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:14Z","lastTransitionTime":"2025-10-06T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.627195 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.627232 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.627242 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.627272 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.627281 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:14Z","lastTransitionTime":"2025-10-06T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.729202 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.729245 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.729277 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.729293 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.729306 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:14Z","lastTransitionTime":"2025-10-06T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.831552 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.831602 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.831610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.831625 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.831635 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:14Z","lastTransitionTime":"2025-10-06T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.933678 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.933714 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.933742 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.933757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:14 crc kubenswrapper[4867]: I1006 13:05:14.933767 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:14Z","lastTransitionTime":"2025-10-06T13:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.035471 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.035507 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.035517 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.035532 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.035542 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:15Z","lastTransitionTime":"2025-10-06T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.137781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.137867 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.137890 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.137918 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.137939 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:15Z","lastTransitionTime":"2025-10-06T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.220863 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.220971 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.221110 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:15 crc kubenswrapper[4867]: E1006 13:05:15.221123 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.221151 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:15 crc kubenswrapper[4867]: E1006 13:05:15.221245 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:15 crc kubenswrapper[4867]: E1006 13:05:15.221402 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:15 crc kubenswrapper[4867]: E1006 13:05:15.221567 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.240036 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.240108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.240127 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.240150 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.240168 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:15Z","lastTransitionTime":"2025-10-06T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.343074 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.343129 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.343146 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.343167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.343180 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:15Z","lastTransitionTime":"2025-10-06T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.446989 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.447057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.447079 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.447106 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.447128 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:15Z","lastTransitionTime":"2025-10-06T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.550334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.550376 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.550388 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.550405 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.550415 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:15Z","lastTransitionTime":"2025-10-06T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.653789 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.653856 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.653880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.653910 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.653935 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:15Z","lastTransitionTime":"2025-10-06T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.757208 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.757272 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.757287 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.757305 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.757317 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:15Z","lastTransitionTime":"2025-10-06T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.860113 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.860167 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.860185 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.860208 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.860226 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:15Z","lastTransitionTime":"2025-10-06T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.963631 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.963703 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.963727 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.963755 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:15 crc kubenswrapper[4867]: I1006 13:05:15.963779 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:15Z","lastTransitionTime":"2025-10-06T13:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.066038 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.066455 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.066475 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.066498 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.066515 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:16Z","lastTransitionTime":"2025-10-06T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.169093 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.169126 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.169137 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.169154 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.169164 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:16Z","lastTransitionTime":"2025-10-06T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.271603 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.271636 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.271645 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.271661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.271670 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:16Z","lastTransitionTime":"2025-10-06T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.373429 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.373464 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.373476 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.373492 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.373505 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:16Z","lastTransitionTime":"2025-10-06T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.475797 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.475845 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.475857 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.475874 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.475888 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:16Z","lastTransitionTime":"2025-10-06T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.577690 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.577725 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.577734 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.577769 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.577780 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:16Z","lastTransitionTime":"2025-10-06T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.680187 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.680220 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.680229 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.680243 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.680273 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:16Z","lastTransitionTime":"2025-10-06T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.782959 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.783085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.783108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.783140 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.783163 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:16Z","lastTransitionTime":"2025-10-06T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.885661 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.885719 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.885730 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.885744 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.885754 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:16Z","lastTransitionTime":"2025-10-06T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.988021 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.988060 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.988068 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.988081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:16 crc kubenswrapper[4867]: I1006 13:05:16.988090 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:16Z","lastTransitionTime":"2025-10-06T13:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.091027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.091076 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.091085 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.091099 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.091107 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:17Z","lastTransitionTime":"2025-10-06T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.193535 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.193566 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.193574 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.193586 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.193594 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:17Z","lastTransitionTime":"2025-10-06T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.221032 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.221085 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.221099 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.221060 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:17 crc kubenswrapper[4867]: E1006 13:05:17.221202 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:17 crc kubenswrapper[4867]: E1006 13:05:17.221329 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:17 crc kubenswrapper[4867]: E1006 13:05:17.221651 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:17 crc kubenswrapper[4867]: E1006 13:05:17.221770 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.222152 4867 scope.go:117] "RemoveContainer" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" Oct 06 13:05:17 crc kubenswrapper[4867]: E1006 13:05:17.222335 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.296291 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.296325 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.296336 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.296372 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.296382 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:17Z","lastTransitionTime":"2025-10-06T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.399319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.399380 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.399392 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.399408 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.399422 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:17Z","lastTransitionTime":"2025-10-06T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.501758 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.501798 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.501810 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.501825 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.501836 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:17Z","lastTransitionTime":"2025-10-06T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.604301 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.604355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.604364 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.604383 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.604392 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:17Z","lastTransitionTime":"2025-10-06T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.706823 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.706887 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.706899 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.706914 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.706926 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:17Z","lastTransitionTime":"2025-10-06T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.808790 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.808833 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.808841 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.808856 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.808866 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:17Z","lastTransitionTime":"2025-10-06T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.911110 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.911181 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.911193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.911210 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:17 crc kubenswrapper[4867]: I1006 13:05:17.911225 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:17Z","lastTransitionTime":"2025-10-06T13:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.013948 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.013992 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.014007 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.014028 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.014042 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.116818 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.117066 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.117182 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.117349 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.117526 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.219993 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.220323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.220394 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.220460 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.220521 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.322750 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.322789 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.322798 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.322812 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.322821 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.425672 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.425783 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.425803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.425837 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.425858 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.528603 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.528679 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.528696 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.528730 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.528748 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.632083 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.632181 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.632205 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.632235 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.632303 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.735187 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.735297 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.735339 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.735375 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.735397 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.839311 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.839359 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.839376 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.839395 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.839405 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.863223 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.863313 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.863327 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.863348 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.863361 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: E1006 13:05:18.878436 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.883463 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.883527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.883541 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.883566 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.883589 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: E1006 13:05:18.896360 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.900081 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.900221 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.900281 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.900335 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.900365 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: E1006 13:05:18.916555 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.921015 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.921064 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.921079 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.921100 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.921117 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: E1006 13:05:18.935673 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.940549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.940588 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.940599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.940617 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.940663 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:18 crc kubenswrapper[4867]: E1006 13:05:18.955369 4867 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T13:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f975b18a-cd36-4c7e-a04c-71b0f488ca5c\\\",\\\"systemUUID\\\":\\\"bd3761ce-1fa1-4021-80de-a06d0f4530ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T13:05:18Z is after 2025-08-24T17:21:41Z" Oct 06 13:05:18 crc kubenswrapper[4867]: E1006 13:05:18.955481 4867 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.957535 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.957563 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.957573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.957586 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:18 crc kubenswrapper[4867]: I1006 13:05:18.957594 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:18Z","lastTransitionTime":"2025-10-06T13:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.059320 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.059366 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.059378 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.059394 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.059406 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:19Z","lastTransitionTime":"2025-10-06T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
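The `x509: certificate has expired` failure against the node.network-node-identity.openshift.io webhook above can be checked directly with openssl. A minimal self-contained sketch follows (the throwaway cert is generated only so the commands run anywhere; the commented live-endpoint command assumes the `127.0.0.1:9743` address from the log and requires the cluster to be up):

```shell
# Generate a throwaway certificate so the sketch is self-contained, then read
# its expiry the same way one would inspect the webhook's serving certificate.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" -days 1 \
  -keyout /tmp/demo-key.pem -out /tmp/demo-cert.pem 2>/dev/null

# Prints a line of the form "notAfter=...": compare it to the current time,
# as the kubelet did when it rejected 2025-08-24T17:21:41Z against 2025-10-06.
openssl x509 -noout -enddate -in /tmp/demo-cert.pem

# Against the live webhook endpoint (hypothetical, cluster must be running):
#   echo | openssl s_client -connect 127.0.0.1:9743 2>/dev/null \
#     | openssl x509 -noout -enddate
```

On CRC, expired internal certificates are typically refreshed by restarting the cluster (which triggers certificate rotation) rather than by replacing files by hand.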
Has your network provider started?"} Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.162201 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.162230 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.162239 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.162273 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.162286 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:19Z","lastTransitionTime":"2025-10-06T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.221514 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.221628 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:19 crc kubenswrapper[4867]: E1006 13:05:19.221799 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.222096 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.222100 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:19 crc kubenswrapper[4867]: E1006 13:05:19.222399 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:19 crc kubenswrapper[4867]: E1006 13:05:19.222560 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:19 crc kubenswrapper[4867]: E1006 13:05:19.222638 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.265024 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.265091 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.265108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.265137 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.265153 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:19Z","lastTransitionTime":"2025-10-06T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.360762 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:19 crc kubenswrapper[4867]: E1006 13:05:19.361008 4867 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:05:19 crc kubenswrapper[4867]: E1006 13:05:19.361115 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs podName:b78c9415-85bd-40db-b44f-f1e04797a66e nodeName:}" failed. No retries permitted until 2025-10-06 13:06:23.361089045 +0000 UTC m=+162.819037409 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs") pod "network-metrics-daemon-8t2sq" (UID: "b78c9415-85bd-40db-b44f-f1e04797a66e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.367997 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.368057 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.368073 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.368100 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.368117 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:19Z","lastTransitionTime":"2025-10-06T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
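The recurring `NetworkReady=false ... no CNI configuration file in /etc/kubernetes/cni/net.d/` condition above means the kubelet found no `*.conf`/`*.conflist` network configuration, usually because the network operator (OVN-Kubernetes on OpenShift) has not yet written one. A small sketch for checking this on the node (path taken from the log; the directory will simply be absent on an unaffected machine):

```shell
# List the CNI configuration directory the kubelet is polling; if it is empty
# or missing, the node stays NotReady until the network provider writes a
# config. The fallback message below mirrors the kubelet's own complaint.
ls -l /etc/kubernetes/cni/net.d/ 2>/dev/null \
  || echo "no CNI configuration present in /etc/kubernetes/cni/net.d/"
```

Once the network plugin writes its conflist there, the kubelet clears the KubeletNotReady condition on its next sync without a restart.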
Has your network provider started?"} Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.470623 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.470658 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.470669 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.470685 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.470694 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:19Z","lastTransitionTime":"2025-10-06T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.578236 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.578318 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.578332 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.578355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.578368 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:19Z","lastTransitionTime":"2025-10-06T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.680814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.680890 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.680908 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.680940 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.680960 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:19Z","lastTransitionTime":"2025-10-06T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.784022 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.784073 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.784084 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.784108 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.784119 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:19Z","lastTransitionTime":"2025-10-06T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.887102 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.887174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.887186 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.887207 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.887221 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:19Z","lastTransitionTime":"2025-10-06T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.990472 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.990545 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.990563 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.990594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:19 crc kubenswrapper[4867]: I1006 13:05:19.990612 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:19Z","lastTransitionTime":"2025-10-06T13:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.093309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.093680 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.093831 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.093927 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.094036 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:20Z","lastTransitionTime":"2025-10-06T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.196539 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.196582 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.196594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.196610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.196621 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:20Z","lastTransitionTime":"2025-10-06T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.298672 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.298718 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.298730 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.298746 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.298757 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:20Z","lastTransitionTime":"2025-10-06T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.401130 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.401180 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.401193 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.401211 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.401223 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:20Z","lastTransitionTime":"2025-10-06T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.502819 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.502852 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.502861 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.502877 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.502888 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:20Z","lastTransitionTime":"2025-10-06T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.605450 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.605491 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.605500 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.605516 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.605527 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:20Z","lastTransitionTime":"2025-10-06T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.707699 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.707743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.707757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.707773 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.707785 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:20Z","lastTransitionTime":"2025-10-06T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.809650 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.809709 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.809723 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.809743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.809843 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:20Z","lastTransitionTime":"2025-10-06T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.911921 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.911974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.911990 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.912009 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:20 crc kubenswrapper[4867]: I1006 13:05:20.912024 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:20Z","lastTransitionTime":"2025-10-06T13:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.014542 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.014591 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.014603 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.014620 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.014632 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:21Z","lastTransitionTime":"2025-10-06T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.116738 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.116794 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.116803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.116816 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.116827 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:21Z","lastTransitionTime":"2025-10-06T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.219879 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.219974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.219985 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.220002 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.220013 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:21Z","lastTransitionTime":"2025-10-06T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.220231 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.220298 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:21 crc kubenswrapper[4867]: E1006 13:05:21.220343 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.220411 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.220416 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:21 crc kubenswrapper[4867]: E1006 13:05:21.220560 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:21 crc kubenswrapper[4867]: E1006 13:05:21.220627 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:21 crc kubenswrapper[4867]: E1006 13:05:21.220685 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.255129 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-knnfm" podStartSLOduration=80.255106596 podStartE2EDuration="1m20.255106596s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:21.255009944 +0000 UTC m=+100.712958098" watchObservedRunningTime="2025-10-06 13:05:21.255106596 +0000 UTC m=+100.713054740" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.265128 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.265110435 podStartE2EDuration="25.265110435s" podCreationTimestamp="2025-10-06 13:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:21.264905549 +0000 UTC m=+100.722853693" watchObservedRunningTime="2025-10-06 13:05:21.265110435 +0000 UTC m=+100.723058579" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.322737 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.322781 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.322792 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.322809 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:21 crc kubenswrapper[4867]: 
I1006 13:05:21.322819 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:21Z","lastTransitionTime":"2025-10-06T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.327005 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podStartSLOduration=80.326991133 podStartE2EDuration="1m20.326991133s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:21.326888391 +0000 UTC m=+100.784836555" watchObservedRunningTime="2025-10-06 13:05:21.326991133 +0000 UTC m=+100.784939277" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.355900 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sdmmb" podStartSLOduration=80.35588269 podStartE2EDuration="1m20.35588269s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:21.339105436 +0000 UTC m=+100.797053580" watchObservedRunningTime="2025-10-06 13:05:21.35588269 +0000 UTC m=+100.813830834" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.356010 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rssjd" podStartSLOduration=80.356004423 podStartE2EDuration="1m20.356004423s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:21.355422528 +0000 UTC m=+100.813370692" watchObservedRunningTime="2025-10-06 13:05:21.356004423 +0000 UTC m=+100.813952567" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.364650 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x2x4x" podStartSLOduration=80.364634336 podStartE2EDuration="1m20.364634336s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:21.363946778 +0000 UTC m=+100.821894922" watchObservedRunningTime="2025-10-06 13:05:21.364634336 +0000 UTC m=+100.822582480" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.385556 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=81.385541346 podStartE2EDuration="1m21.385541346s" podCreationTimestamp="2025-10-06 13:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:21.385032383 +0000 UTC m=+100.842980557" watchObservedRunningTime="2025-10-06 13:05:21.385541346 +0000 UTC m=+100.843489490" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.424695 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.424741 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.424752 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.424772 4867 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.424783 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:21Z","lastTransitionTime":"2025-10-06T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.463139 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-f4ldd" podStartSLOduration=80.46310759 podStartE2EDuration="1m20.46310759s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:21.440682811 +0000 UTC m=+100.898630955" watchObservedRunningTime="2025-10-06 13:05:21.46310759 +0000 UTC m=+100.921055734" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.464009 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=80.464003793 podStartE2EDuration="1m20.464003793s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:21.463583042 +0000 UTC m=+100.921531196" watchObservedRunningTime="2025-10-06 13:05:21.464003793 +0000 UTC m=+100.921951937" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.484610 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=78.484594495 podStartE2EDuration="1m18.484594495s" 
podCreationTimestamp="2025-10-06 13:04:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:21.48360235 +0000 UTC m=+100.941550484" watchObservedRunningTime="2025-10-06 13:05:21.484594495 +0000 UTC m=+100.942542639" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.496217 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=45.496202585 podStartE2EDuration="45.496202585s" podCreationTimestamp="2025-10-06 13:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:21.495833026 +0000 UTC m=+100.953781170" watchObservedRunningTime="2025-10-06 13:05:21.496202585 +0000 UTC m=+100.954150729" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.526970 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.527020 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.527034 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.527054 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.527065 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:21Z","lastTransitionTime":"2025-10-06T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.629519 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.629576 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.629591 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.629610 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.629629 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:21Z","lastTransitionTime":"2025-10-06T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.732087 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.732124 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.732133 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.732145 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.732157 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:21Z","lastTransitionTime":"2025-10-06T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.834468 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.834504 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.834512 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.834527 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.834535 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:21Z","lastTransitionTime":"2025-10-06T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.938208 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.938270 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.938284 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.938309 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:21 crc kubenswrapper[4867]: I1006 13:05:21.938324 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:21Z","lastTransitionTime":"2025-10-06T13:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.040842 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.040915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.040934 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.040959 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.040978 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:22Z","lastTransitionTime":"2025-10-06T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.143448 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.143477 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.143485 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.143497 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.143505 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:22Z","lastTransitionTime":"2025-10-06T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.245626 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.245663 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.245674 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.245689 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.245700 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:22Z","lastTransitionTime":"2025-10-06T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.347691 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.347721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.347730 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.347743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.347752 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:22Z","lastTransitionTime":"2025-10-06T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.450192 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.450266 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.450276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.450290 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.450298 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:22Z","lastTransitionTime":"2025-10-06T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.552654 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.552693 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.552704 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.552721 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.552733 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:22Z","lastTransitionTime":"2025-10-06T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.655346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.655384 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.655394 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.655413 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.655430 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:22Z","lastTransitionTime":"2025-10-06T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.758176 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.758212 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.758220 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.758235 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.758246 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:22Z","lastTransitionTime":"2025-10-06T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.860192 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.860228 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.860239 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.860271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.860285 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:22Z","lastTransitionTime":"2025-10-06T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.962270 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.962308 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.962319 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.962334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:22 crc kubenswrapper[4867]: I1006 13:05:22.962348 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:22Z","lastTransitionTime":"2025-10-06T13:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.065104 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.065147 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.065158 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.065173 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.065184 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:23Z","lastTransitionTime":"2025-10-06T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.167220 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.167275 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.167284 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.167298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.167307 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:23Z","lastTransitionTime":"2025-10-06T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.221068 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.221123 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.221193 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:23 crc kubenswrapper[4867]: E1006 13:05:23.221191 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:23 crc kubenswrapper[4867]: E1006 13:05:23.221327 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.221378 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:23 crc kubenswrapper[4867]: E1006 13:05:23.221402 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:23 crc kubenswrapper[4867]: E1006 13:05:23.221488 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.269121 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.269154 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.269163 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.269177 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.269187 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:23Z","lastTransitionTime":"2025-10-06T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.371280 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.371318 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.371329 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.371346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.371357 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:23Z","lastTransitionTime":"2025-10-06T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.473556 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.473599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.473608 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.473624 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.473634 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:23Z","lastTransitionTime":"2025-10-06T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.575752 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.575799 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.575809 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.575825 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.575837 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:23Z","lastTransitionTime":"2025-10-06T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.678030 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.678067 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.678078 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.678094 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.678105 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:23Z","lastTransitionTime":"2025-10-06T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.779670 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.779713 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.779725 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.779743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.779754 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:23Z","lastTransitionTime":"2025-10-06T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.882448 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.882496 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.882509 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.882524 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.882537 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:23Z","lastTransitionTime":"2025-10-06T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.984910 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.984951 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.984960 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.984978 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:23 crc kubenswrapper[4867]: I1006 13:05:23.984988 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:23Z","lastTransitionTime":"2025-10-06T13:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.087298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.087336 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.087346 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.087358 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.087367 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:24Z","lastTransitionTime":"2025-10-06T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.189433 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.189500 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.189522 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.189550 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.189576 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:24Z","lastTransitionTime":"2025-10-06T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.292218 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.292314 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.292338 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.292368 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.292390 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:24Z","lastTransitionTime":"2025-10-06T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.394705 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.394743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.394757 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.394772 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.394781 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:24Z","lastTransitionTime":"2025-10-06T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.497146 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.497189 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.497198 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.497211 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.497221 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:24Z","lastTransitionTime":"2025-10-06T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.599531 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.599577 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.599588 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.599606 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.599617 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:24Z","lastTransitionTime":"2025-10-06T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.702541 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.702574 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.702583 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.702597 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.702608 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:24Z","lastTransitionTime":"2025-10-06T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.805053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.805091 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.805098 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.805112 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.805126 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:24Z","lastTransitionTime":"2025-10-06T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.907463 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.907530 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.907542 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.907558 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:24 crc kubenswrapper[4867]: I1006 13:05:24.907568 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:24Z","lastTransitionTime":"2025-10-06T13:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.010728 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.010813 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.010849 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.010933 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.010969 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:25Z","lastTransitionTime":"2025-10-06T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.113729 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.113814 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.113826 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.113843 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.113855 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:25Z","lastTransitionTime":"2025-10-06T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.216217 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.216283 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.216293 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.216307 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.216317 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:25Z","lastTransitionTime":"2025-10-06T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.220668 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.220688 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.220704 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.220706 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:25 crc kubenswrapper[4867]: E1006 13:05:25.220800 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:25 crc kubenswrapper[4867]: E1006 13:05:25.220903 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:25 crc kubenswrapper[4867]: E1006 13:05:25.220964 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:25 crc kubenswrapper[4867]: E1006 13:05:25.221052 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.318804 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.318848 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.318858 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.318872 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.318880 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:25Z","lastTransitionTime":"2025-10-06T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.420650 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.420703 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.420712 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.420726 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.420735 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:25Z","lastTransitionTime":"2025-10-06T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.523082 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.523133 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.523143 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.523161 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.523171 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:25Z","lastTransitionTime":"2025-10-06T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.625782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.625813 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.625822 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.625838 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.625849 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:25Z","lastTransitionTime":"2025-10-06T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.727846 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.727888 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.727900 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.727917 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.727927 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:25Z","lastTransitionTime":"2025-10-06T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.830174 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.830213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.830236 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.830298 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.830312 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:25Z","lastTransitionTime":"2025-10-06T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.932649 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.932719 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.932743 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.932770 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:25 crc kubenswrapper[4867]: I1006 13:05:25.932791 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:25Z","lastTransitionTime":"2025-10-06T13:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.035131 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.035191 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.035213 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.035241 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.035300 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:26Z","lastTransitionTime":"2025-10-06T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.137619 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.137663 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.137679 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.137699 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.137714 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:26Z","lastTransitionTime":"2025-10-06T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.240555 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.240600 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.240613 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.240629 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.240639 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:26Z","lastTransitionTime":"2025-10-06T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.342809 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.342868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.342880 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.342899 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.342910 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:26Z","lastTransitionTime":"2025-10-06T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.444806 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.444854 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.444868 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.444883 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.444894 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:26Z","lastTransitionTime":"2025-10-06T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.546752 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.546779 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.546788 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.546803 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.546812 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:26Z","lastTransitionTime":"2025-10-06T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.648834 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.648870 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.648883 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.648899 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.648911 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:26Z","lastTransitionTime":"2025-10-06T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.750903 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.751465 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.751549 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.751646 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.751731 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:26Z","lastTransitionTime":"2025-10-06T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.854073 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.854348 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.854427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.854495 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.854566 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:26Z","lastTransitionTime":"2025-10-06T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.956291 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.956340 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.956350 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.956364 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:26 crc kubenswrapper[4867]: I1006 13:05:26.956374 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:26Z","lastTransitionTime":"2025-10-06T13:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.058622 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.058672 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.058685 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.058701 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.058712 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:27Z","lastTransitionTime":"2025-10-06T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.160760 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.160813 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.160826 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.160843 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.160856 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:27Z","lastTransitionTime":"2025-10-06T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.220434 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.220504 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:27 crc kubenswrapper[4867]: E1006 13:05:27.220548 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.220574 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:27 crc kubenswrapper[4867]: E1006 13:05:27.220634 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.220526 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:27 crc kubenswrapper[4867]: E1006 13:05:27.220700 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:27 crc kubenswrapper[4867]: E1006 13:05:27.220710 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.262972 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.263002 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.263010 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.263023 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.263032 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:27Z","lastTransitionTime":"2025-10-06T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.365314 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.365348 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.365356 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.365367 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.365375 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:27Z","lastTransitionTime":"2025-10-06T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.467355 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.467401 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.467412 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.467427 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.467439 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:27Z","lastTransitionTime":"2025-10-06T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.569824 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.569866 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.569878 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.569897 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.569909 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:27Z","lastTransitionTime":"2025-10-06T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.671915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.671953 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.671962 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.671974 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.671983 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:27Z","lastTransitionTime":"2025-10-06T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.774341 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.774393 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.774403 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.774417 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.774427 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:27Z","lastTransitionTime":"2025-10-06T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.876303 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.876349 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.876360 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.876374 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.876383 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:27Z","lastTransitionTime":"2025-10-06T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.978548 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.978590 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.978599 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.978613 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:27 crc kubenswrapper[4867]: I1006 13:05:27.978621 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:27Z","lastTransitionTime":"2025-10-06T13:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.081276 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.081323 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.081334 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.081351 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.081363 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:28Z","lastTransitionTime":"2025-10-06T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.184053 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.184097 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.184105 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.184120 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.184131 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:28Z","lastTransitionTime":"2025-10-06T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.286652 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.286696 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.286705 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.286718 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.286729 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:28Z","lastTransitionTime":"2025-10-06T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.388847 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.388915 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.388927 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.388945 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.388957 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:28Z","lastTransitionTime":"2025-10-06T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.491236 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.491318 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.491331 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.491349 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.491361 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:28Z","lastTransitionTime":"2025-10-06T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.593981 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.594019 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.594027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.594042 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.594050 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:28Z","lastTransitionTime":"2025-10-06T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.696271 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.696333 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.696344 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.696360 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.696372 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:28Z","lastTransitionTime":"2025-10-06T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.799740 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.799782 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.799792 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.799808 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.799823 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:28Z","lastTransitionTime":"2025-10-06T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.902483 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.902523 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.902573 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.902594 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:28 crc kubenswrapper[4867]: I1006 13:05:28.902607 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:28Z","lastTransitionTime":"2025-10-06T13:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.004949 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.004985 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.005013 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.005027 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.005036 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:29Z","lastTransitionTime":"2025-10-06T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.107437 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.107483 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.107492 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.107509 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.107518 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:29Z","lastTransitionTime":"2025-10-06T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.139524 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.139560 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.139569 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.139583 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.139594 4867 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T13:05:29Z","lastTransitionTime":"2025-10-06T13:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.185148 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl"] Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.185490 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.187363 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.187645 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.187833 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.189711 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.220759 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.220791 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:29 crc kubenswrapper[4867]: E1006 13:05:29.220868 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.220883 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.220928 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:29 crc kubenswrapper[4867]: E1006 13:05:29.221064 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:29 crc kubenswrapper[4867]: E1006 13:05:29.221196 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:29 crc kubenswrapper[4867]: E1006 13:05:29.221238 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.257828 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d9218760-98fa-4100-b874-4653199c605b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.257900 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9218760-98fa-4100-b874-4653199c605b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.257982 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9218760-98fa-4100-b874-4653199c605b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.258042 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d9218760-98fa-4100-b874-4653199c605b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.258137 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d9218760-98fa-4100-b874-4653199c605b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.359389 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9218760-98fa-4100-b874-4653199c605b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.359444 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d9218760-98fa-4100-b874-4653199c605b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.359479 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d9218760-98fa-4100-b874-4653199c605b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.359507 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d9218760-98fa-4100-b874-4653199c605b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: 
\"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.359523 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9218760-98fa-4100-b874-4653199c605b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.359637 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d9218760-98fa-4100-b874-4653199c605b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.359738 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d9218760-98fa-4100-b874-4653199c605b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.360503 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d9218760-98fa-4100-b874-4653199c605b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.365099 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d9218760-98fa-4100-b874-4653199c605b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.374943 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9218760-98fa-4100-b874-4653199c605b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9qvsl\" (UID: \"d9218760-98fa-4100-b874-4653199c605b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.500370 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" Oct 06 13:05:29 crc kubenswrapper[4867]: W1006 13:05:29.514183 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9218760_98fa_4100_b874_4653199c605b.slice/crio-3c66743370f9aef8e7f741b953da5a6a6ac1240f97f52cdb69899e086cf8faec WatchSource:0}: Error finding container 3c66743370f9aef8e7f741b953da5a6a6ac1240f97f52cdb69899e086cf8faec: Status 404 returned error can't find the container with id 3c66743370f9aef8e7f741b953da5a6a6ac1240f97f52cdb69899e086cf8faec Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.748908 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" event={"ID":"d9218760-98fa-4100-b874-4653199c605b","Type":"ContainerStarted","Data":"b2db8f75f6c02c9f5b2cd6726f6ff0258525807e329c77f8491735ba7aef979d"} Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.749150 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" 
event={"ID":"d9218760-98fa-4100-b874-4653199c605b","Type":"ContainerStarted","Data":"3c66743370f9aef8e7f741b953da5a6a6ac1240f97f52cdb69899e086cf8faec"} Oct 06 13:05:29 crc kubenswrapper[4867]: I1006 13:05:29.761470 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9qvsl" podStartSLOduration=88.76145536 podStartE2EDuration="1m28.76145536s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:29.759987672 +0000 UTC m=+109.217935816" watchObservedRunningTime="2025-10-06 13:05:29.76145536 +0000 UTC m=+109.219403504" Oct 06 13:05:31 crc kubenswrapper[4867]: I1006 13:05:31.220605 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:31 crc kubenswrapper[4867]: I1006 13:05:31.220710 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:31 crc kubenswrapper[4867]: E1006 13:05:31.221496 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:31 crc kubenswrapper[4867]: I1006 13:05:31.221702 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:31 crc kubenswrapper[4867]: I1006 13:05:31.222019 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:31 crc kubenswrapper[4867]: E1006 13:05:31.222148 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:31 crc kubenswrapper[4867]: I1006 13:05:31.222242 4867 scope.go:117] "RemoveContainer" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" Oct 06 13:05:31 crc kubenswrapper[4867]: E1006 13:05:31.222384 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zlc7z_openshift-ovn-kubernetes(93569a52-4f36-4017-9834-b3651d6cd63e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" Oct 06 13:05:31 crc kubenswrapper[4867]: E1006 13:05:31.222425 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:31 crc kubenswrapper[4867]: E1006 13:05:31.222478 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:33 crc kubenswrapper[4867]: I1006 13:05:33.221143 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:33 crc kubenswrapper[4867]: I1006 13:05:33.221224 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:33 crc kubenswrapper[4867]: I1006 13:05:33.221156 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:33 crc kubenswrapper[4867]: E1006 13:05:33.221440 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:33 crc kubenswrapper[4867]: I1006 13:05:33.221460 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:33 crc kubenswrapper[4867]: E1006 13:05:33.221581 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:33 crc kubenswrapper[4867]: E1006 13:05:33.221677 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:33 crc kubenswrapper[4867]: E1006 13:05:33.221806 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:35 crc kubenswrapper[4867]: I1006 13:05:35.220335 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:35 crc kubenswrapper[4867]: I1006 13:05:35.220455 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:35 crc kubenswrapper[4867]: I1006 13:05:35.220509 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:35 crc kubenswrapper[4867]: E1006 13:05:35.220478 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:35 crc kubenswrapper[4867]: E1006 13:05:35.220567 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:35 crc kubenswrapper[4867]: I1006 13:05:35.220583 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:35 crc kubenswrapper[4867]: E1006 13:05:35.220850 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:35 crc kubenswrapper[4867]: E1006 13:05:35.220932 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:35 crc kubenswrapper[4867]: I1006 13:05:35.764623 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-knnfm_8e3bebeb-f8c1-4b1e-a320-b937eced1c3a/kube-multus/1.log" Oct 06 13:05:35 crc kubenswrapper[4867]: I1006 13:05:35.765137 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-knnfm_8e3bebeb-f8c1-4b1e-a320-b937eced1c3a/kube-multus/0.log" Oct 06 13:05:35 crc kubenswrapper[4867]: I1006 13:05:35.765191 4867 generic.go:334] "Generic (PLEG): container finished" podID="8e3bebeb-f8c1-4b1e-a320-b937eced1c3a" containerID="5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4" exitCode=1 Oct 06 13:05:35 crc kubenswrapper[4867]: I1006 13:05:35.765237 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-knnfm" event={"ID":"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a","Type":"ContainerDied","Data":"5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4"} Oct 06 13:05:35 crc kubenswrapper[4867]: I1006 13:05:35.765345 4867 scope.go:117] "RemoveContainer" containerID="2f8f4a11ff2013b07c62fa00df8880dd32698d52756b600a7eb0702268007771" Oct 06 13:05:35 crc kubenswrapper[4867]: I1006 13:05:35.765770 4867 scope.go:117] "RemoveContainer" containerID="5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4" Oct 06 13:05:35 crc kubenswrapper[4867]: E1006 13:05:35.766004 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-knnfm_openshift-multus(8e3bebeb-f8c1-4b1e-a320-b937eced1c3a)\"" pod="openshift-multus/multus-knnfm" podUID="8e3bebeb-f8c1-4b1e-a320-b937eced1c3a" Oct 06 13:05:36 crc kubenswrapper[4867]: I1006 13:05:36.769071 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-knnfm_8e3bebeb-f8c1-4b1e-a320-b937eced1c3a/kube-multus/1.log" Oct 06 13:05:37 crc kubenswrapper[4867]: I1006 13:05:37.221072 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:37 crc kubenswrapper[4867]: E1006 13:05:37.221780 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:37 crc kubenswrapper[4867]: I1006 13:05:37.221156 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:37 crc kubenswrapper[4867]: E1006 13:05:37.222287 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:37 crc kubenswrapper[4867]: I1006 13:05:37.221088 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:37 crc kubenswrapper[4867]: E1006 13:05:37.222497 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:37 crc kubenswrapper[4867]: I1006 13:05:37.221169 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:37 crc kubenswrapper[4867]: E1006 13:05:37.222759 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:39 crc kubenswrapper[4867]: I1006 13:05:39.220497 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:39 crc kubenswrapper[4867]: E1006 13:05:39.220613 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:39 crc kubenswrapper[4867]: I1006 13:05:39.220504 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:39 crc kubenswrapper[4867]: I1006 13:05:39.220508 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:39 crc kubenswrapper[4867]: E1006 13:05:39.220724 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:39 crc kubenswrapper[4867]: E1006 13:05:39.220803 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:39 crc kubenswrapper[4867]: I1006 13:05:39.220497 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:39 crc kubenswrapper[4867]: E1006 13:05:39.220878 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:41 crc kubenswrapper[4867]: E1006 13:05:41.202199 4867 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 06 13:05:41 crc kubenswrapper[4867]: I1006 13:05:41.220996 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:41 crc kubenswrapper[4867]: I1006 13:05:41.221071 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:41 crc kubenswrapper[4867]: I1006 13:05:41.221153 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:41 crc kubenswrapper[4867]: E1006 13:05:41.222061 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:41 crc kubenswrapper[4867]: I1006 13:05:41.222158 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:41 crc kubenswrapper[4867]: E1006 13:05:41.222229 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:41 crc kubenswrapper[4867]: E1006 13:05:41.222948 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:41 crc kubenswrapper[4867]: E1006 13:05:41.223043 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:41 crc kubenswrapper[4867]: E1006 13:05:41.341853 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 13:05:43 crc kubenswrapper[4867]: I1006 13:05:43.220797 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:43 crc kubenswrapper[4867]: I1006 13:05:43.220809 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:43 crc kubenswrapper[4867]: I1006 13:05:43.220838 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:43 crc kubenswrapper[4867]: I1006 13:05:43.220918 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:43 crc kubenswrapper[4867]: E1006 13:05:43.221841 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:43 crc kubenswrapper[4867]: E1006 13:05:43.221972 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:43 crc kubenswrapper[4867]: E1006 13:05:43.222026 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:43 crc kubenswrapper[4867]: E1006 13:05:43.222108 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:45 crc kubenswrapper[4867]: I1006 13:05:45.220233 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:45 crc kubenswrapper[4867]: I1006 13:05:45.220435 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:45 crc kubenswrapper[4867]: I1006 13:05:45.220438 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:45 crc kubenswrapper[4867]: I1006 13:05:45.220478 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:45 crc kubenswrapper[4867]: E1006 13:05:45.220530 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:45 crc kubenswrapper[4867]: E1006 13:05:45.220670 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:45 crc kubenswrapper[4867]: E1006 13:05:45.221217 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:45 crc kubenswrapper[4867]: E1006 13:05:45.221447 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:45 crc kubenswrapper[4867]: I1006 13:05:45.222021 4867 scope.go:117] "RemoveContainer" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" Oct 06 13:05:45 crc kubenswrapper[4867]: I1006 13:05:45.799343 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/3.log" Oct 06 13:05:45 crc kubenswrapper[4867]: I1006 13:05:45.801954 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerStarted","Data":"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69"} Oct 06 13:05:45 crc kubenswrapper[4867]: I1006 13:05:45.802529 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:05:45 crc 
kubenswrapper[4867]: I1006 13:05:45.831938 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podStartSLOduration=104.831918144 podStartE2EDuration="1m44.831918144s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:05:45.831645187 +0000 UTC m=+125.289593331" watchObservedRunningTime="2025-10-06 13:05:45.831918144 +0000 UTC m=+125.289866308" Oct 06 13:05:46 crc kubenswrapper[4867]: I1006 13:05:46.067203 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8t2sq"] Oct 06 13:05:46 crc kubenswrapper[4867]: I1006 13:05:46.067344 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:46 crc kubenswrapper[4867]: E1006 13:05:46.067430 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:46 crc kubenswrapper[4867]: E1006 13:05:46.343345 4867 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 13:05:47 crc kubenswrapper[4867]: I1006 13:05:47.220236 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:47 crc kubenswrapper[4867]: I1006 13:05:47.220245 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:47 crc kubenswrapper[4867]: I1006 13:05:47.220458 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:47 crc kubenswrapper[4867]: E1006 13:05:47.220445 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:47 crc kubenswrapper[4867]: E1006 13:05:47.220545 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:47 crc kubenswrapper[4867]: I1006 13:05:47.220624 4867 scope.go:117] "RemoveContainer" containerID="5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4" Oct 06 13:05:47 crc kubenswrapper[4867]: E1006 13:05:47.220701 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:47 crc kubenswrapper[4867]: I1006 13:05:47.810146 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-knnfm_8e3bebeb-f8c1-4b1e-a320-b937eced1c3a/kube-multus/1.log" Oct 06 13:05:47 crc kubenswrapper[4867]: I1006 13:05:47.810196 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-knnfm" event={"ID":"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a","Type":"ContainerStarted","Data":"069d91feced47fa7dd985fd0691e86c74dc903221691f3caea33965e465d529f"} Oct 06 13:05:48 crc kubenswrapper[4867]: I1006 13:05:48.220746 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:48 crc kubenswrapper[4867]: E1006 13:05:48.220995 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:49 crc kubenswrapper[4867]: I1006 13:05:49.221036 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:49 crc kubenswrapper[4867]: E1006 13:05:49.221457 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:49 crc kubenswrapper[4867]: I1006 13:05:49.221111 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:49 crc kubenswrapper[4867]: E1006 13:05:49.221586 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:49 crc kubenswrapper[4867]: I1006 13:05:49.221070 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:49 crc kubenswrapper[4867]: E1006 13:05:49.221907 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:50 crc kubenswrapper[4867]: I1006 13:05:50.220795 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:50 crc kubenswrapper[4867]: E1006 13:05:50.220939 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8t2sq" podUID="b78c9415-85bd-40db-b44f-f1e04797a66e" Oct 06 13:05:51 crc kubenswrapper[4867]: I1006 13:05:51.220244 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:51 crc kubenswrapper[4867]: I1006 13:05:51.220343 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:51 crc kubenswrapper[4867]: I1006 13:05:51.220775 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:51 crc kubenswrapper[4867]: E1006 13:05:51.221711 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 13:05:51 crc kubenswrapper[4867]: E1006 13:05:51.222327 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 13:05:51 crc kubenswrapper[4867]: E1006 13:05:51.222436 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 13:05:52 crc kubenswrapper[4867]: I1006 13:05:52.220807 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:05:52 crc kubenswrapper[4867]: I1006 13:05:52.222764 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 06 13:05:52 crc kubenswrapper[4867]: I1006 13:05:52.223057 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 06 13:05:53 crc kubenswrapper[4867]: I1006 13:05:53.220979 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:05:53 crc kubenswrapper[4867]: I1006 13:05:53.221070 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:05:53 crc kubenswrapper[4867]: I1006 13:05:53.220985 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:05:53 crc kubenswrapper[4867]: I1006 13:05:53.223198 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 06 13:05:53 crc kubenswrapper[4867]: I1006 13:05:53.223266 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 06 13:05:53 crc kubenswrapper[4867]: I1006 13:05:53.223452 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 06 13:05:53 crc kubenswrapper[4867]: I1006 13:05:53.223572 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.732939 4867 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.775536 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pklsp"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.776559 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.783096 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.783425 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bf8xq"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.783988 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.784135 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.784306 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.784542 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.784674 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.784680 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.784132 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.785692 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.785746 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.786626 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.786700 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.788681 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.789660 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.790617 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.791287 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.793453 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.793701 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.794486 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.795133 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.795656 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.796278 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8lg7n"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.797621 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.799791 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.800699 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-75xtb"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.800836 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.803285 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vf59b"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.803608 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.803813 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2vpqs"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.804214 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.804599 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.804724 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.805117 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.805542 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.805919 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.805940 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.806014 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.806084 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vf59b" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.806228 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.805948 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.806424 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.806588 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-x4d8v"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.806601 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.807325 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-x4d8v" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.807321 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.807695 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.808186 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.808784 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j646b"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.809162 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.815686 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.816454 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.816974 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.817237 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.817575 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.818841 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.820292 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.824819 4867 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.826018 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.835068 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h9ff"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.836014 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ftjpf"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.836691 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.836694 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.837806 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.840610 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.840807 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.840907 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.840942 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.841002 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.841066 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.841203 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.841530 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.841671 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.841842 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.841874 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.842181 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.842356 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.842444 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.842515 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.842589 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.842656 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.842727 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.842802 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.842885 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.837815 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.842973 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.836777 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.836831 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838087 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.843485 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838159 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838174 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.843593 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838208 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838243 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 06 13:05:59 crc 
kubenswrapper[4867]: I1006 13:05:59.838295 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.843656 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838323 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838376 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838405 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838422 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838456 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838505 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.838514 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.839021 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.844190 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.845547 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-87z4s"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.846370 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.846668 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-svjt7"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.847188 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.848652 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rqnc4"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.849267 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.853643 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.853827 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.853916 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.854000 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.854045 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.854092 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.853930 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.853938 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.854162 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.856027 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 06 
13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.857359 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.858098 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.858400 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.858696 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.858812 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.858924 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.859552 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.859681 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.859763 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.859824 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.860130 4867 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.860273 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.860401 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.861310 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.862086 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.862105 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.862739 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.863318 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.864087 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.865368 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.867025 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.872049 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.904049 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.904744 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hzns5"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.905730 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zl588"] Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.906779 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.907144 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.907804 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.909687 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.910343 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.910466 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.915397 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.919697 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.920494 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.920725 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.921434 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.922704 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.922956 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.923430 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.923613 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.924608 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.926678 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.927521 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.927764 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xj8lg"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.928226 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.929566 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ggkgn"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.930570 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.930725 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.931028 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.931734 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.932291 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.932713 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fp56z"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.933143 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.934439 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.935054 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.936185 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.937279 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pklsp"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.937300 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.938796 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.941229 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.941299 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.942999 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.944658 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.946481 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x4d8v"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.947273 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-rff6q"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.948123 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rff6q"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.948650 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j646b"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.949976 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.952125 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.953871 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vf59b"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.955578 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rqnc4"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.956955 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.958202 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hzns5"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.958736 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.959754 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.961378 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.964941 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-87z4s"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.971322 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.974647 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8lg7n"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.977375 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.978611 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bf8xq"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.979145 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.979828 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980426 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2b9m\" (UniqueName: \"kubernetes.io/projected/d4a6a1dc-585a-43b9-afbe-d5054a71e70e-kube-api-access-r2b9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-ncwsb\" (UID: \"d4a6a1dc-585a-43b9-afbe-d5054a71e70e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980464 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-etcd-serving-ca\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980504 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980524 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a370043f-0cb1-4279-8ebd-d3b15ef2980a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h5sdt\" (UID: \"a370043f-0cb1-4279-8ebd-d3b15ef2980a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980544 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b0dc85d-57e3-49f7-89b3-a89daad03f39-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980566 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b0dc85d-57e3-49f7-89b3-a89daad03f39-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980588 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts7b6\" (UniqueName: \"kubernetes.io/projected/8826a928-e7d1-4cb1-bd08-69849ee5a12b-kube-api-access-ts7b6\") pod \"downloads-7954f5f757-x4d8v\" (UID: \"8826a928-e7d1-4cb1-bd08-69849ee5a12b\") " pod="openshift-console/downloads-7954f5f757-x4d8v"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980619 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-serving-cert\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980649 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv2h6\" (UniqueName: \"kubernetes.io/projected/21b2c8cf-2109-4663-bdb0-b106d2a4c548-kube-api-access-gv2h6\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980673 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac64bd06-d021-43c4-8f20-bff80a406d77-config\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " pod="openshift-console-operator/console-operator-58897d9998-vf59b"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980698 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/21b2c8cf-2109-4663-bdb0-b106d2a4c548-node-pullsecrets\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980715 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21b2c8cf-2109-4663-bdb0-b106d2a4c548-serving-cert\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980739 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4a6a1dc-585a-43b9-afbe-d5054a71e70e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ncwsb\" (UID: \"d4a6a1dc-585a-43b9-afbe-d5054a71e70e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980760 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-client-ca\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980785 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21b2c8cf-2109-4663-bdb0-b106d2a4c548-etcd-client\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980807 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-image-import-ca\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980831 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a6a1dc-585a-43b9-afbe-d5054a71e70e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ncwsb\" (UID: \"d4a6a1dc-585a-43b9-afbe-d5054a71e70e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980868 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djgql\" (UniqueName: \"kubernetes.io/projected/4b0dc85d-57e3-49f7-89b3-a89daad03f39-kube-api-access-djgql\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980892 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l85p\" (UniqueName: \"kubernetes.io/projected/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-kube-api-access-2l85p\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980916 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-audit\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980939 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8bbc\" (UniqueName: \"kubernetes.io/projected/ac64bd06-d021-43c4-8f20-bff80a406d77-kube-api-access-k8bbc\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " pod="openshift-console-operator/console-operator-58897d9998-vf59b"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.980979 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a370043f-0cb1-4279-8ebd-d3b15ef2980a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h5sdt\" (UID: \"a370043f-0cb1-4279-8ebd-d3b15ef2980a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.981008 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-config\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.981046 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac64bd06-d021-43c4-8f20-bff80a406d77-serving-cert\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " pod="openshift-console-operator/console-operator-58897d9998-vf59b"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.981072 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7862\" (UniqueName: \"kubernetes.io/projected/a370043f-0cb1-4279-8ebd-d3b15ef2980a-kube-api-access-h7862\") pod \"openshift-apiserver-operator-796bbdcf4f-h5sdt\" (UID: \"a370043f-0cb1-4279-8ebd-d3b15ef2980a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.981097 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-config\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.981125 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b0dc85d-57e3-49f7-89b3-a89daad03f39-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.981149 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/21b2c8cf-2109-4663-bdb0-b106d2a4c548-encryption-config\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.981171 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21b2c8cf-2109-4663-bdb0-b106d2a4c548-audit-dir\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.981194 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac64bd06-d021-43c4-8f20-bff80a406d77-trusted-ca\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " pod="openshift-console-operator/console-operator-58897d9998-vf59b"
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.992421 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h9ff"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.992491 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.995971 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-75xtb"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.996096 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd"]
Oct 06 13:05:59 crc kubenswrapper[4867]: I1006 13:05:59.997343 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zl588"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.000904 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2vpqs"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.000966 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4k5tp"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.001962 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4k5tp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.002312 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.003104 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ftjpf"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.004322 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.006466 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fp56z"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.006565 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.009320 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ggkgn"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.009409 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.022034 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.022088 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4k5tp"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.022099 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.029658 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.030338 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xj8lg"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.031554 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-zpvzr"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.032781 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-zpvzr"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.043414 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-zpvzr"]
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.043550 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.059133 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.079347 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094170 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-serving-cert\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094221 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac64bd06-d021-43c4-8f20-bff80a406d77-config\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " pod="openshift-console-operator/console-operator-58897d9998-vf59b"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094246 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv2h6\" (UniqueName: \"kubernetes.io/projected/21b2c8cf-2109-4663-bdb0-b106d2a4c548-kube-api-access-gv2h6\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094287 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/21b2c8cf-2109-4663-bdb0-b106d2a4c548-node-pullsecrets\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094304 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21b2c8cf-2109-4663-bdb0-b106d2a4c548-serving-cert\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094325 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4a6a1dc-585a-43b9-afbe-d5054a71e70e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ncwsb\" (UID: \"d4a6a1dc-585a-43b9-afbe-d5054a71e70e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094340 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-client-ca\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094355 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21b2c8cf-2109-4663-bdb0-b106d2a4c548-etcd-client\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094371 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-image-import-ca\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094399 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a6a1dc-585a-43b9-afbe-d5054a71e70e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ncwsb\" (UID: \"d4a6a1dc-585a-43b9-afbe-d5054a71e70e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094420 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djgql\" (UniqueName: \"kubernetes.io/projected/4b0dc85d-57e3-49f7-89b3-a89daad03f39-kube-api-access-djgql\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094441 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l85p\" (UniqueName: \"kubernetes.io/projected/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-kube-api-access-2l85p\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094462 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8bbc\" (UniqueName: \"kubernetes.io/projected/ac64bd06-d021-43c4-8f20-bff80a406d77-kube-api-access-k8bbc\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " pod="openshift-console-operator/console-operator-58897d9998-vf59b"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094481 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-audit\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094509 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a370043f-0cb1-4279-8ebd-d3b15ef2980a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h5sdt\" (UID: \"a370043f-0cb1-4279-8ebd-d3b15ef2980a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094526 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-config\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094553 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac64bd06-d021-43c4-8f20-bff80a406d77-serving-cert\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " pod="openshift-console-operator/console-operator-58897d9998-vf59b"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094569 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-config\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094587 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7862\" (UniqueName: \"kubernetes.io/projected/a370043f-0cb1-4279-8ebd-d3b15ef2980a-kube-api-access-h7862\") pod \"openshift-apiserver-operator-796bbdcf4f-h5sdt\" (UID: \"a370043f-0cb1-4279-8ebd-d3b15ef2980a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094608 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b0dc85d-57e3-49f7-89b3-a89daad03f39-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094630 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/21b2c8cf-2109-4663-bdb0-b106d2a4c548-encryption-config\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094654 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21b2c8cf-2109-4663-bdb0-b106d2a4c548-audit-dir\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094678 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac64bd06-d021-43c4-8f20-bff80a406d77-trusted-ca\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " pod="openshift-console-operator/console-operator-58897d9998-vf59b"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094713 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2b9m\" (UniqueName: \"kubernetes.io/projected/d4a6a1dc-585a-43b9-afbe-d5054a71e70e-kube-api-access-r2b9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-ncwsb\" (UID: \"d4a6a1dc-585a-43b9-afbe-d5054a71e70e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094734 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-etcd-serving-ca\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094754 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp"
Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094775 4867 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a370043f-0cb1-4279-8ebd-d3b15ef2980a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h5sdt\" (UID: \"a370043f-0cb1-4279-8ebd-d3b15ef2980a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094797 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b0dc85d-57e3-49f7-89b3-a89daad03f39-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094831 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b0dc85d-57e3-49f7-89b3-a89daad03f39-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.094855 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts7b6\" (UniqueName: \"kubernetes.io/projected/8826a928-e7d1-4cb1-bd08-69849ee5a12b-kube-api-access-ts7b6\") pod \"downloads-7954f5f757-x4d8v\" (UID: \"8826a928-e7d1-4cb1-bd08-69849ee5a12b\") " pod="openshift-console/downloads-7954f5f757-x4d8v" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.095384 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ghwjb"] Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.095650 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/21b2c8cf-2109-4663-bdb0-b106d2a4c548-node-pullsecrets\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.095845 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac64bd06-d021-43c4-8f20-bff80a406d77-config\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " pod="openshift-console-operator/console-operator-58897d9998-vf59b" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.096819 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ghwjb" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.096846 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-image-import-ca\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.096628 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a6a1dc-585a-43b9-afbe-d5054a71e70e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ncwsb\" (UID: \"d4a6a1dc-585a-43b9-afbe-d5054a71e70e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.097763 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-audit\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " 
pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.097905 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21b2c8cf-2109-4663-bdb0-b106d2a4c548-audit-dir\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.097938 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-etcd-serving-ca\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.097995 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a370043f-0cb1-4279-8ebd-d3b15ef2980a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h5sdt\" (UID: \"a370043f-0cb1-4279-8ebd-d3b15ef2980a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.098832 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.099474 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ac64bd06-d021-43c4-8f20-bff80a406d77-trusted-ca\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " 
pod="openshift-console-operator/console-operator-58897d9998-vf59b" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.100221 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-config\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.100297 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b2c8cf-2109-4663-bdb0-b106d2a4c548-config\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.100308 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21b2c8cf-2109-4663-bdb0-b106d2a4c548-serving-cert\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.100544 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-serving-cert\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.100973 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-client-ca\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.101201 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b0dc85d-57e3-49f7-89b3-a89daad03f39-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.101321 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac64bd06-d021-43c4-8f20-bff80a406d77-serving-cert\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " pod="openshift-console-operator/console-operator-58897d9998-vf59b" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.101991 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b0dc85d-57e3-49f7-89b3-a89daad03f39-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.102774 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.103998 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21b2c8cf-2109-4663-bdb0-b106d2a4c548-etcd-client\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.104166 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4a6a1dc-585a-43b9-afbe-d5054a71e70e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ncwsb\" (UID: \"d4a6a1dc-585a-43b9-afbe-d5054a71e70e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.105062 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ghwjb"] Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.106328 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a370043f-0cb1-4279-8ebd-d3b15ef2980a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h5sdt\" (UID: \"a370043f-0cb1-4279-8ebd-d3b15ef2980a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.109350 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/21b2c8cf-2109-4663-bdb0-b106d2a4c548-encryption-config\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.118719 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.139754 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.158940 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.179708 4867 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.198913 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.218496 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.239280 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.258856 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.279353 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.298892 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.319672 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.338715 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.359814 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.379897 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.398985 4867 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"console-config" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.418719 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.439564 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.465520 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.479409 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.508234 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.519389 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.539564 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.559137 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.579698 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.599628 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.619234 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.639274 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.659845 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.699063 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.739095 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.759482 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.778799 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.799128 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.818924 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.838589 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.859516 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.879541 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.899390 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.919624 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.937773 4867 request.go:700] Waited for 1.017009475s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dkube-controller-manager-operator-config&limit=500&resourceVersion=0 Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.939698 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.959573 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.979527 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 06 13:06:00 crc kubenswrapper[4867]: I1006 13:06:00.998929 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.019386 4867 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.039350 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.059466 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.079370 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.099151 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.118908 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.138633 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.158538 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.178470 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.205535 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.219561 4867 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.239449 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.259637 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.278837 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.299548 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.318864 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.339165 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.358738 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.379339 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.398401 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.419464 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.438441 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.458529 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.479393 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.500029 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.519625 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.539210 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.558960 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.579471 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.598759 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.619399 4867 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.639495 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.659624 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.679662 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.699386 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.718413 4867 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.738490 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.772398 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts7b6\" (UniqueName: \"kubernetes.io/projected/8826a928-e7d1-4cb1-bd08-69849ee5a12b-kube-api-access-ts7b6\") pod \"downloads-7954f5f757-x4d8v\" (UID: \"8826a928-e7d1-4cb1-bd08-69849ee5a12b\") " pod="openshift-console/downloads-7954f5f757-x4d8v" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.797911 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2b9m\" (UniqueName: \"kubernetes.io/projected/d4a6a1dc-585a-43b9-afbe-d5054a71e70e-kube-api-access-r2b9m\") pod \"openshift-controller-manager-operator-756b6f6bc6-ncwsb\" (UID: \"d4a6a1dc-585a-43b9-afbe-d5054a71e70e\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.815065 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv2h6\" (UniqueName: \"kubernetes.io/projected/21b2c8cf-2109-4663-bdb0-b106d2a4c548-kube-api-access-gv2h6\") pod \"apiserver-76f77b778f-pklsp\" (UID: \"21b2c8cf-2109-4663-bdb0-b106d2a4c548\") " pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.818983 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-x4d8v" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.831777 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7862\" (UniqueName: \"kubernetes.io/projected/a370043f-0cb1-4279-8ebd-d3b15ef2980a-kube-api-access-h7862\") pod \"openshift-apiserver-operator-796bbdcf4f-h5sdt\" (UID: \"a370043f-0cb1-4279-8ebd-d3b15ef2980a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.851642 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djgql\" (UniqueName: \"kubernetes.io/projected/4b0dc85d-57e3-49f7-89b3-a89daad03f39-kube-api-access-djgql\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.859497 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.866976 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.879479 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.898977 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.919450 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.933108 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.955121 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l85p\" (UniqueName: \"kubernetes.io/projected/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-kube-api-access-2l85p\") pod \"route-controller-manager-6576b87f9c-m6n6l\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.957554 4867 request.go:700] Waited for 1.859629687s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/serviceaccounts/console-operator/token Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.965365 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.976756 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8bbc\" (UniqueName: \"kubernetes.io/projected/ac64bd06-d021-43c4-8f20-bff80a406d77-kube-api-access-k8bbc\") pod \"console-operator-58897d9998-vf59b\" (UID: \"ac64bd06-d021-43c4-8f20-bff80a406d77\") " pod="openshift-console-operator/console-operator-58897d9998-vf59b" Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.982738 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-x4d8v"] Oct 06 13:06:01 crc kubenswrapper[4867]: W1006 13:06:01.990179 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8826a928_e7d1_4cb1_bd08_69849ee5a12b.slice/crio-a81fef00c59b559a5d71f4fce83f68fb05e879443e358530a1eef8a644f36282 WatchSource:0}: Error finding container a81fef00c59b559a5d71f4fce83f68fb05e879443e358530a1eef8a644f36282: Status 404 returned error can't find the container with id a81fef00c59b559a5d71f4fce83f68fb05e879443e358530a1eef8a644f36282 Oct 06 13:06:01 crc kubenswrapper[4867]: I1006 13:06:01.997409 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b0dc85d-57e3-49f7-89b3-a89daad03f39-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rwkhc\" (UID: \"4b0dc85d-57e3-49f7-89b3-a89daad03f39\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.018034 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt"] Oct 06 13:06:02 crc kubenswrapper[4867]: W1006 13:06:02.023915 4867 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda370043f_0cb1_4279_8ebd_d3b15ef2980a.slice/crio-7b9ffa0c5bbd3499d5687b90ea1baf57272f7986ccaf8d849dbaa52fc9f08d70 WatchSource:0}: Error finding container 7b9ffa0c5bbd3499d5687b90ea1baf57272f7986ccaf8d849dbaa52fc9f08d70: Status 404 returned error can't find the container with id 7b9ffa0c5bbd3499d5687b90ea1baf57272f7986ccaf8d849dbaa52fc9f08d70 Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.047110 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.093763 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.094092 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vf59b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113404 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-default-certificate\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113464 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdkbv\" (UniqueName: \"kubernetes.io/projected/497dc758-b39a-4915-8eac-e4cb9e2e2a80-kube-api-access-gdkbv\") pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 
13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113498 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113542 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrtl\" (UniqueName: \"kubernetes.io/projected/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-kube-api-access-qsrtl\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113560 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113576 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-metrics-certs\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113616 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1e611b5d-e10c-4148-a5af-b62505d05e74-etcd-client\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113634 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5m2m\" (UniqueName: \"kubernetes.io/projected/1e611b5d-e10c-4148-a5af-b62505d05e74-kube-api-access-s5m2m\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113649 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-client-ca\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113663 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-serving-cert\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113687 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pjqs\" (UniqueName: \"kubernetes.io/projected/ed7648e1-d992-4263-9117-e50cd88a66a9-kube-api-access-2pjqs\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 
crc kubenswrapper[4867]: I1006 13:06:02.113757 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2044b33-c26a-4abb-b304-d7fa7eaaec71-metrics-tls\") pod \"dns-operator-744455d44c-2vpqs\" (UID: \"d2044b33-c26a-4abb-b304-d7fa7eaaec71\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113774 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2e9bd61-4b86-4b8f-873e-6a143973f249-etcd-service-ca\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113790 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-service-ca-bundle\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113805 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-config\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113818 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-config\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: 
\"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113834 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/497dc758-b39a-4915-8eac-e4cb9e2e2a80-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113849 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113897 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-stats-auth\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113918 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113936 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f98ebfd5-26fa-49e3-a072-2578e344889b-config\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113955 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113972 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/470ad2dd-46cd-49e5-ac59-032631bfcb0b-serving-cert\") pod \"openshift-config-operator-7777fb866f-75xtb\" (UID: \"470ad2dd-46cd-49e5-ac59-032631bfcb0b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.113987 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/048512de-ffeb-44c8-a11f-58513ae09db2-config\") pod \"kube-apiserver-operator-766d6c64bb-8vp8h\" (UID: \"048512de-ffeb-44c8-a11f-58513ae09db2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114003 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-dir\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: 
\"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114018 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5d3717e-bfd0-4f99-9b43-f871cc3c1801-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkxnb\" (UID: \"a5d3717e-bfd0-4f99-9b43-f871cc3c1801\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114034 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-metrics-tls\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114051 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-oauth-serving-cert\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114096 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e9bd61-4b86-4b8f-873e-6a143973f249-serving-cert\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114113 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-2htdv\" (UniqueName: \"kubernetes.io/projected/2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0-kube-api-access-2htdv\") pod \"cluster-samples-operator-665b6dd947-s9l2c\" (UID: \"2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114128 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b2e9bd61-4b86-4b8f-873e-6a143973f249-etcd-client\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114146 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-bound-sa-token\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114164 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114178 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7t46\" (UniqueName: \"kubernetes.io/projected/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-kube-api-access-f7t46\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " 
pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114193 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkntm\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-kube-api-access-qkntm\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114208 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e9bd61-4b86-4b8f-873e-6a143973f249-config\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114224 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dpb2\" (UniqueName: \"kubernetes.io/projected/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-kube-api-access-6dpb2\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114239 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114269 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zdldb\" (UniqueName: \"kubernetes.io/projected/8c451a50-f142-4702-91ac-987dc000746b-kube-api-access-zdldb\") pod \"olm-operator-6b444d44fb-h9grd\" (UID: \"8c451a50-f142-4702-91ac-987dc000746b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114284 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e611b5d-e10c-4148-a5af-b62505d05e74-serving-cert\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114299 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114316 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/048512de-ffeb-44c8-a11f-58513ae09db2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8vp8h\" (UID: \"048512de-ffeb-44c8-a11f-58513ae09db2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114330 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114347 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgz92\" (UniqueName: \"kubernetes.io/projected/fa021310-c3a2-4feb-93a7-0b2eb6307147-kube-api-access-xgz92\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114361 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-service-ca\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114376 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-trusted-ca-bundle\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114391 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-s9l2c\" (UID: \"2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114407 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-registry-certificates\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114422 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f98ebfd5-26fa-49e3-a072-2578e344889b-auth-proxy-config\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114435 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-config\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114449 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048512de-ffeb-44c8-a11f-58513ae09db2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8vp8h\" (UID: \"048512de-ffeb-44c8-a11f-58513ae09db2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114500 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-trusted-ca\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 
crc kubenswrapper[4867]: I1006 13:06:02.114631 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/497dc758-b39a-4915-8eac-e4cb9e2e2a80-proxy-tls\") pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114671 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbc6c\" (UniqueName: \"kubernetes.io/projected/674661a9-9e17-4d57-b887-8294a70fdcad-kube-api-access-nbc6c\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114700 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed7648e1-d992-4263-9117-e50cd88a66a9-images\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114719 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114750 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/1e611b5d-e10c-4148-a5af-b62505d05e74-audit-dir\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114774 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-487lp\" (UniqueName: \"kubernetes.io/projected/f98ebfd5-26fa-49e3-a072-2578e344889b-kube-api-access-487lp\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114814 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l85l\" (UniqueName: \"kubernetes.io/projected/b2e9bd61-4b86-4b8f-873e-6a143973f249-kube-api-access-2l85l\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114852 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-oauth-config\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114888 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smc5k\" (UniqueName: \"kubernetes.io/projected/470ad2dd-46cd-49e5-ac59-032631bfcb0b-kube-api-access-smc5k\") pod \"openshift-config-operator-7777fb866f-75xtb\" (UID: \"470ad2dd-46cd-49e5-ac59-032631bfcb0b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 
06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114911 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b2e9bd61-4b86-4b8f-873e-6a143973f249-etcd-ca\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114930 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f98ebfd5-26fa-49e3-a072-2578e344889b-machine-approver-tls\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114954 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw52g\" (UniqueName: \"kubernetes.io/projected/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-kube-api-access-dw52g\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.114991 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e611b5d-e10c-4148-a5af-b62505d05e74-audit-policies\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115044 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115128 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5d3717e-bfd0-4f99-9b43-f871cc3c1801-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkxnb\" (UID: \"a5d3717e-bfd0-4f99-9b43-f871cc3c1801\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115225 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-registry-tls\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115319 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f59db107-9767-4161-83f0-09f15ba1d881-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115346 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed7648e1-d992-4263-9117-e50cd88a66a9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.115367 4867 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:02.615350127 +0000 UTC m=+142.073298391 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115411 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115441 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/674661a9-9e17-4d57-b887-8294a70fdcad-serving-cert\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115480 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d3717e-bfd0-4f99-9b43-f871cc3c1801-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkxnb\" 
(UID: \"a5d3717e-bfd0-4f99-9b43-f871cc3c1801\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115509 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-policies\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115534 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-trusted-ca\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115714 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8c451a50-f142-4702-91ac-987dc000746b-srv-cert\") pod \"olm-operator-6b444d44fb-h9grd\" (UID: \"8c451a50-f142-4702-91ac-987dc000746b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115749 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115773 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-serving-cert\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115802 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1e611b5d-e10c-4148-a5af-b62505d05e74-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115827 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115885 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed7648e1-d992-4263-9117-e50cd88a66a9-config\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115909 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-service-ca-bundle\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: 
\"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115932 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/470ad2dd-46cd-49e5-ac59-032631bfcb0b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-75xtb\" (UID: \"470ad2dd-46cd-49e5-ac59-032631bfcb0b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115954 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.115978 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f59db107-9767-4161-83f0-09f15ba1d881-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.116008 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/497dc758-b39a-4915-8eac-e4cb9e2e2a80-images\") pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 
13:06:02.116064 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c451a50-f142-4702-91ac-987dc000746b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h9grd\" (UID: \"8c451a50-f142-4702-91ac-987dc000746b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.116089 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e611b5d-e10c-4148-a5af-b62505d05e74-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.116112 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1e611b5d-e10c-4148-a5af-b62505d05e74-encryption-config\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.116159 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mx9n\" (UniqueName: \"kubernetes.io/projected/d2044b33-c26a-4abb-b304-d7fa7eaaec71-kube-api-access-9mx9n\") pod \"dns-operator-744455d44c-2vpqs\" (UID: \"d2044b33-c26a-4abb-b304-d7fa7eaaec71\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.168721 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pklsp"] Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.202351 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"] Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219008 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.219311 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:02.719294844 +0000 UTC m=+142.177242988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219445 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdldb\" (UniqueName: \"kubernetes.io/projected/8c451a50-f142-4702-91ac-987dc000746b-kube-api-access-zdldb\") pod \"olm-operator-6b444d44fb-h9grd\" (UID: \"8c451a50-f142-4702-91ac-987dc000746b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219463 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1e611b5d-e10c-4148-a5af-b62505d05e74-serving-cert\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219489 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219504 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219519 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/048512de-ffeb-44c8-a11f-58513ae09db2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8vp8h\" (UID: \"048512de-ffeb-44c8-a11f-58513ae09db2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219539 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/598298ab-238a-4776-9e23-e66c273dc805-secret-volume\") pod \"collect-profiles-29329260-qlqqc\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:02 crc kubenswrapper[4867]: 
I1006 13:06:02.219556 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgz92\" (UniqueName: \"kubernetes.io/projected/fa021310-c3a2-4feb-93a7-0b2eb6307147-kube-api-access-xgz92\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219572 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-service-ca\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219586 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/59dff37a-5c33-4cb3-a10d-e73fe741d7e3-signing-key\") pod \"service-ca-9c57cc56f-ggkgn\" (UID: \"59dff37a-5c33-4cb3-a10d-e73fe741d7e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219610 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-trusted-ca-bundle\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219626 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-s9l2c\" (UID: \"2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219642 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/598298ab-238a-4776-9e23-e66c273dc805-config-volume\") pod \"collect-profiles-29329260-qlqqc\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219658 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-registry-certificates\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219675 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xj8lg\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219700 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f98ebfd5-26fa-49e3-a072-2578e344889b-auth-proxy-config\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219715 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-config\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219731 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048512de-ffeb-44c8-a11f-58513ae09db2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8vp8h\" (UID: \"048512de-ffeb-44c8-a11f-58513ae09db2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219746 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-trusted-ca\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219762 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/497dc758-b39a-4915-8eac-e4cb9e2e2a80-proxy-tls\") pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219777 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48abb7a8-e725-44c3-b95a-25c70999773f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8kf44\" (UID: \"48abb7a8-e725-44c3-b95a-25c70999773f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" Oct 06 13:06:02 crc 
kubenswrapper[4867]: I1006 13:06:02.219796 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pjmd\" (UniqueName: \"kubernetes.io/projected/48abb7a8-e725-44c3-b95a-25c70999773f-kube-api-access-2pjmd\") pod \"kube-storage-version-migrator-operator-b67b599dd-8kf44\" (UID: \"48abb7a8-e725-44c3-b95a-25c70999773f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219812 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219830 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbc6c\" (UniqueName: \"kubernetes.io/projected/674661a9-9e17-4d57-b887-8294a70fdcad-kube-api-access-nbc6c\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219849 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed7648e1-d992-4263-9117-e50cd88a66a9-images\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219890 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kpvj\" (UniqueName: 
\"kubernetes.io/projected/b9f0a917-6145-4144-b761-62695806f129-kube-api-access-5kpvj\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219912 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcjkt\" (UniqueName: \"kubernetes.io/projected/4c8dd3b2-db00-42b3-a439-43360407c2bd-kube-api-access-xcjkt\") pod \"ingress-canary-ghwjb\" (UID: \"4c8dd3b2-db00-42b3-a439-43360407c2bd\") " pod="openshift-ingress-canary/ingress-canary-ghwjb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219931 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e611b5d-e10c-4148-a5af-b62505d05e74-audit-dir\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219948 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-487lp\" (UniqueName: \"kubernetes.io/projected/f98ebfd5-26fa-49e3-a072-2578e344889b-kube-api-access-487lp\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219966 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e4f928da-a8de-4a0b-a256-0002c1b76e81-certs\") pod \"machine-config-server-rff6q\" (UID: \"e4f928da-a8de-4a0b-a256-0002c1b76e81\") " pod="openshift-machine-config-operator/machine-config-server-rff6q" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.219988 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab82a97-184d-4b45-b051-f6fbdb925819-config\") pod \"kube-controller-manager-operator-78b949d7b-zszfp\" (UID: \"aab82a97-184d-4b45-b051-f6fbdb925819\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.220025 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l85l\" (UniqueName: \"kubernetes.io/projected/b2e9bd61-4b86-4b8f-873e-6a143973f249-kube-api-access-2l85l\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.220058 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-oauth-config\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.220080 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b9f0a917-6145-4144-b761-62695806f129-tmpfs\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.220099 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9f0a917-6145-4144-b761-62695806f129-apiservice-cert\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.220118 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e4f928da-a8de-4a0b-a256-0002c1b76e81-node-bootstrap-token\") pod \"machine-config-server-rff6q\" (UID: \"e4f928da-a8de-4a0b-a256-0002c1b76e81\") " pod="openshift-machine-config-operator/machine-config-server-rff6q" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.220137 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xj8lg\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226129 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aab82a97-184d-4b45-b051-f6fbdb925819-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zszfp\" (UID: \"aab82a97-184d-4b45-b051-f6fbdb925819\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226209 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smc5k\" (UniqueName: \"kubernetes.io/projected/470ad2dd-46cd-49e5-ac59-032631bfcb0b-kube-api-access-smc5k\") pod \"openshift-config-operator-7777fb866f-75xtb\" (UID: \"470ad2dd-46cd-49e5-ac59-032631bfcb0b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226242 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b2e9bd61-4b86-4b8f-873e-6a143973f249-etcd-ca\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226280 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f98ebfd5-26fa-49e3-a072-2578e344889b-machine-approver-tls\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226302 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/59dff37a-5c33-4cb3-a10d-e73fe741d7e3-signing-cabundle\") pod \"service-ca-9c57cc56f-ggkgn\" (UID: \"59dff37a-5c33-4cb3-a10d-e73fe741d7e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226324 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67691fb1-2774-4035-91b1-ca6c26fa7de6-config\") pod \"service-ca-operator-777779d784-fp56z\" (UID: \"67691fb1-2774-4035-91b1-ca6c26fa7de6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226345 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw52g\" (UniqueName: \"kubernetes.io/projected/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-kube-api-access-dw52g\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc 
kubenswrapper[4867]: I1006 13:06:02.226366 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-mountpoint-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226406 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e611b5d-e10c-4148-a5af-b62505d05e74-audit-policies\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226434 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226452 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5d3717e-bfd0-4f99-9b43-f871cc3c1801-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkxnb\" (UID: \"a5d3717e-bfd0-4f99-9b43-f871cc3c1801\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226470 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9b0008a7-3dfa-4e42-be35-3493c341fc69-srv-cert\") pod \"catalog-operator-68c6474976-tcmnq\" (UID: 
\"9b0008a7-3dfa-4e42-be35-3493c341fc69\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226487 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-registry-tls\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226501 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f59db107-9767-4161-83f0-09f15ba1d881-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226520 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed7648e1-d992-4263-9117-e50cd88a66a9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226541 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226559 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/674661a9-9e17-4d57-b887-8294a70fdcad-serving-cert\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226577 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d3717e-bfd0-4f99-9b43-f871cc3c1801-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkxnb\" (UID: \"a5d3717e-bfd0-4f99-9b43-f871cc3c1801\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226596 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48abb7a8-e725-44c3-b95a-25c70999773f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8kf44\" (UID: \"48abb7a8-e725-44c3-b95a-25c70999773f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226615 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d606431-b258-4baa-bed9-95e82471aa02-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zl588\" (UID: \"1d606431-b258-4baa-bed9-95e82471aa02\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226646 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-policies\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: 
\"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226667 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9f0a917-6145-4144-b761-62695806f129-webhook-cert\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226687 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxhhj\" (UniqueName: \"kubernetes.io/projected/b0af59da-ee70-4965-88c1-422936f4156d-kube-api-access-hxhhj\") pod \"dns-default-4k5tp\" (UID: \"b0af59da-ee70-4965-88c1-422936f4156d\") " pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226709 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c8dd3b2-db00-42b3-a439-43360407c2bd-cert\") pod \"ingress-canary-ghwjb\" (UID: \"4c8dd3b2-db00-42b3-a439-43360407c2bd\") " pod="openshift-ingress-canary/ingress-canary-ghwjb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226733 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-trusted-ca\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226775 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8c451a50-f142-4702-91ac-987dc000746b-srv-cert\") pod 
\"olm-operator-6b444d44fb-h9grd\" (UID: \"8c451a50-f142-4702-91ac-987dc000746b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226798 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsnbj\" (UniqueName: \"kubernetes.io/projected/1204892b-a86d-4b14-9aca-1fcbd64c9cd2-kube-api-access-xsnbj\") pod \"control-plane-machine-set-operator-78cbb6b69f-dclxp\" (UID: \"1204892b-a86d-4b14-9aca-1fcbd64c9cd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226821 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226840 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-serving-cert\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226860 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq6td\" (UniqueName: \"kubernetes.io/projected/e2b1c8fc-889a-4fd7-b34a-72c9c6575b93-kube-api-access-cq6td\") pod \"migrator-59844c95c7-kv4q6\" (UID: \"e2b1c8fc-889a-4fd7-b34a-72c9c6575b93\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226877 
4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b5sb\" (UniqueName: \"kubernetes.io/projected/3f1a0563-0bdb-42ae-8754-ef1f299414d8-kube-api-access-6b5sb\") pod \"multus-admission-controller-857f4d67dd-hzns5\" (UID: \"3f1a0563-0bdb-42ae-8754-ef1f299414d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.227287 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e611b5d-e10c-4148-a5af-b62505d05e74-audit-policies\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.222746 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-trusted-ca-bundle\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.222769 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f98ebfd5-26fa-49e3-a072-2578e344889b-auth-proxy-config\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.224128 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-config\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc 
kubenswrapper[4867]: E1006 13:06:02.227335 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:02.727317934 +0000 UTC m=+142.185266108 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.234619 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1e611b5d-e10c-4148-a5af-b62505d05e74-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.234645 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7v7g\" (UniqueName: \"kubernetes.io/projected/67691fb1-2774-4035-91b1-ca6c26fa7de6-kube-api-access-s7v7g\") pod \"service-ca-operator-777779d784-fp56z\" (UID: \"67691fb1-2774-4035-91b1-ca6c26fa7de6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.234844 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: 
\"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.228520 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-s9l2c\" (UID: \"2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.228525 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-policies\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.228766 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/048512de-ffeb-44c8-a11f-58513ae09db2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8vp8h\" (UID: \"048512de-ffeb-44c8-a11f-58513ae09db2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.222414 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e611b5d-e10c-4148-a5af-b62505d05e74-audit-dir\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.221244 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-service-ca\") pod 
\"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.221780 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.230328 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-trusted-ca\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.230542 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f98ebfd5-26fa-49e3-a072-2578e344889b-machine-approver-tls\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.222553 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ed7648e1-d992-4263-9117-e50cd88a66a9-images\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.230853 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/497dc758-b39a-4915-8eac-e4cb9e2e2a80-proxy-tls\") 
pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.230558 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5d3717e-bfd0-4f99-9b43-f871cc3c1801-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkxnb\" (UID: \"a5d3717e-bfd0-4f99-9b43-f871cc3c1801\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.232139 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8c451a50-f142-4702-91ac-987dc000746b-srv-cert\") pod \"olm-operator-6b444d44fb-h9grd\" (UID: \"8c451a50-f142-4702-91ac-987dc000746b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.233638 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f59db107-9767-4161-83f0-09f15ba1d881-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.233845 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-oauth-config\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.234149 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/b2e9bd61-4b86-4b8f-873e-6a143973f249-etcd-ca\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.225424 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-registry-certificates\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.226363 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.227898 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-trusted-ca\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.235778 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e611b5d-e10c-4148-a5af-b62505d05e74-serving-cert\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.235992 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d3717e-bfd0-4f99-9b43-f871cc3c1801-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkxnb\" (UID: \"a5d3717e-bfd0-4f99-9b43-f871cc3c1801\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236027 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1e611b5d-e10c-4148-a5af-b62505d05e74-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236328 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d606431-b258-4baa-bed9-95e82471aa02-proxy-tls\") pod \"machine-config-controller-84d6567774-zl588\" (UID: \"1d606431-b258-4baa-bed9-95e82471aa02\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236398 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236465 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed7648e1-d992-4263-9117-e50cd88a66a9-config\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 
13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236496 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0af59da-ee70-4965-88c1-422936f4156d-metrics-tls\") pod \"dns-default-4k5tp\" (UID: \"b0af59da-ee70-4965-88c1-422936f4156d\") " pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236521 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-service-ca-bundle\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236543 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/470ad2dd-46cd-49e5-ac59-032631bfcb0b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-75xtb\" (UID: \"470ad2dd-46cd-49e5-ac59-032631bfcb0b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236566 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236589 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-csi-data-dir\") pod 
\"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236613 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f59db107-9767-4161-83f0-09f15ba1d881-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236660 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/497dc758-b39a-4915-8eac-e4cb9e2e2a80-images\") pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236682 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-socket-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236720 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c451a50-f142-4702-91ac-987dc000746b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h9grd\" (UID: \"8c451a50-f142-4702-91ac-987dc000746b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236745 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1e611b5d-e10c-4148-a5af-b62505d05e74-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236769 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1e611b5d-e10c-4148-a5af-b62505d05e74-encryption-config\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236796 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mx9n\" (UniqueName: \"kubernetes.io/projected/d2044b33-c26a-4abb-b304-d7fa7eaaec71-kube-api-access-9mx9n\") pod \"dns-operator-744455d44c-2vpqs\" (UID: \"d2044b33-c26a-4abb-b304-d7fa7eaaec71\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236820 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xhxh\" (UniqueName: \"kubernetes.io/projected/e4f928da-a8de-4a0b-a256-0002c1b76e81-kube-api-access-2xhxh\") pod \"machine-config-server-rff6q\" (UID: \"e4f928da-a8de-4a0b-a256-0002c1b76e81\") " pod="openshift-machine-config-operator/machine-config-server-rff6q" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236859 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-default-certificate\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236883 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gdkbv\" (UniqueName: \"kubernetes.io/projected/497dc758-b39a-4915-8eac-e4cb9e2e2a80-kube-api-access-gdkbv\") pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236926 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236951 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrtl\" (UniqueName: \"kubernetes.io/projected/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-kube-api-access-qsrtl\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.236976 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237002 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-metrics-certs\") pod \"router-default-5444994796-svjt7\" (UID: 
\"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237021 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1e611b5d-e10c-4148-a5af-b62505d05e74-etcd-client\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237042 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5m2m\" (UniqueName: \"kubernetes.io/projected/1e611b5d-e10c-4148-a5af-b62505d05e74-kube-api-access-s5m2m\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237063 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-client-ca\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237085 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-serving-cert\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237108 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ztc2\" (UniqueName: 
\"kubernetes.io/projected/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-kube-api-access-4ztc2\") pod \"marketplace-operator-79b997595-xj8lg\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237146 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pjqs\" (UniqueName: \"kubernetes.io/projected/ed7648e1-d992-4263-9117-e50cd88a66a9-kube-api-access-2pjqs\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237169 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2044b33-c26a-4abb-b304-d7fa7eaaec71-metrics-tls\") pod \"dns-operator-744455d44c-2vpqs\" (UID: \"d2044b33-c26a-4abb-b304-d7fa7eaaec71\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237193 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2e9bd61-4b86-4b8f-873e-6a143973f249-etcd-service-ca\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237218 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-service-ca-bundle\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237241 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm465\" (UniqueName: \"kubernetes.io/projected/59dff37a-5c33-4cb3-a10d-e73fe741d7e3-kube-api-access-gm465\") pod \"service-ca-9c57cc56f-ggkgn\" (UID: \"59dff37a-5c33-4cb3-a10d-e73fe741d7e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237291 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-config\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237315 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-config\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237337 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/497dc758-b39a-4915-8eac-e4cb9e2e2a80-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237361 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1204892b-a86d-4b14-9aca-1fcbd64c9cd2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dclxp\" (UID: 
\"1204892b-a86d-4b14-9aca-1fcbd64c9cd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237394 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237423 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3f1a0563-0bdb-42ae-8754-ef1f299414d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hzns5\" (UID: \"3f1a0563-0bdb-42ae-8754-ef1f299414d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237447 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpmx\" (UniqueName: \"kubernetes.io/projected/217a87bd-eeb4-49fa-ad25-c8d35ede1637-kube-api-access-tkpmx\") pod \"package-server-manager-789f6589d5-rhkpx\" (UID: \"217a87bd-eeb4-49fa-ad25-c8d35ede1637\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237490 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237511 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-stats-auth\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237561 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqxqb\" (UniqueName: \"kubernetes.io/projected/1d606431-b258-4baa-bed9-95e82471aa02-kube-api-access-gqxqb\") pod \"machine-config-controller-84d6567774-zl588\" (UID: \"1d606431-b258-4baa-bed9-95e82471aa02\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.237582 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0af59da-ee70-4965-88c1-422936f4156d-config-volume\") pod \"dns-default-4k5tp\" (UID: \"b0af59da-ee70-4965-88c1-422936f4156d\") " pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.239128 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b2e9bd61-4b86-4b8f-873e-6a143973f249-etcd-service-ca\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.239196 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-registry-tls\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: 
I1006 13:06:02.240185 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-client-ca\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.241342 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/470ad2dd-46cd-49e5-ac59-032631bfcb0b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-75xtb\" (UID: \"470ad2dd-46cd-49e5-ac59-032631bfcb0b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.241787 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-service-ca-bundle\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243461 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2044b33-c26a-4abb-b304-d7fa7eaaec71-metrics-tls\") pod \"dns-operator-744455d44c-2vpqs\" (UID: \"d2044b33-c26a-4abb-b304-d7fa7eaaec71\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243501 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f98ebfd5-26fa-49e3-a072-2578e344889b-config\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" 
Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243538 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/217a87bd-eeb4-49fa-ad25-c8d35ede1637-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rhkpx\" (UID: \"217a87bd-eeb4-49fa-ad25-c8d35ede1637\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243582 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-plugins-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243603 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc2bc\" (UniqueName: \"kubernetes.io/projected/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-kube-api-access-bc2bc\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243629 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243668 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/470ad2dd-46cd-49e5-ac59-032631bfcb0b-serving-cert\") pod \"openshift-config-operator-7777fb866f-75xtb\" (UID: \"470ad2dd-46cd-49e5-ac59-032631bfcb0b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243688 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/048512de-ffeb-44c8-a11f-58513ae09db2-config\") pod \"kube-apiserver-operator-766d6c64bb-8vp8h\" (UID: \"048512de-ffeb-44c8-a11f-58513ae09db2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243707 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aab82a97-184d-4b45-b051-f6fbdb925819-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zszfp\" (UID: \"aab82a97-184d-4b45-b051-f6fbdb925819\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243747 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-dir\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243766 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5d3717e-bfd0-4f99-9b43-f871cc3c1801-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkxnb\" (UID: \"a5d3717e-bfd0-4f99-9b43-f871cc3c1801\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243781 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-metrics-tls\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243820 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9b0008a7-3dfa-4e42-be35-3493c341fc69-profile-collector-cert\") pod \"catalog-operator-68c6474976-tcmnq\" (UID: \"9b0008a7-3dfa-4e42-be35-3493c341fc69\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243934 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-oauth-serving-cert\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.243991 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e9bd61-4b86-4b8f-873e-6a143973f249-serving-cert\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244009 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b2e9bd61-4b86-4b8f-873e-6a143973f249-etcd-client\") 
pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244026 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2htdv\" (UniqueName: \"kubernetes.io/projected/2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0-kube-api-access-2htdv\") pod \"cluster-samples-operator-665b6dd947-s9l2c\" (UID: \"2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244065 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-bound-sa-token\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244083 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244101 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7t46\" (UniqueName: \"kubernetes.io/projected/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-kube-api-access-f7t46\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244139 4867 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qkntm\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-kube-api-access-qkntm\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244158 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e9bd61-4b86-4b8f-873e-6a143973f249-config\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244174 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dpb2\" (UniqueName: \"kubernetes.io/projected/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-kube-api-access-6dpb2\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244193 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2n7s\" (UniqueName: \"kubernetes.io/projected/598298ab-238a-4776-9e23-e66c273dc805-kube-api-access-v2n7s\") pod \"collect-profiles-29329260-qlqqc\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244228 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn4xh\" (UniqueName: \"kubernetes.io/projected/9b0008a7-3dfa-4e42-be35-3493c341fc69-kube-api-access-fn4xh\") pod \"catalog-operator-68c6474976-tcmnq\" (UID: \"9b0008a7-3dfa-4e42-be35-3493c341fc69\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244285 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244303 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-registration-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.244319 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67691fb1-2774-4035-91b1-ca6c26fa7de6-serving-cert\") pod \"service-ca-operator-777779d784-fp56z\" (UID: \"67691fb1-2774-4035-91b1-ca6c26fa7de6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.252281 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-stats-auth\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.253017 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-default-certificate\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.241805 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed7648e1-d992-4263-9117-e50cd88a66a9-config\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.254949 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.255123 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.257617 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-service-ca-bundle\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.258073 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.258443 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f98ebfd5-26fa-49e3-a072-2578e344889b-config\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.258588 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/674661a9-9e17-4d57-b887-8294a70fdcad-serving-cert\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.259057 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-config\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.259341 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e611b5d-e10c-4148-a5af-b62505d05e74-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.262362 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.262669 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed7648e1-d992-4263-9117-e50cd88a66a9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.263962 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.264007 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-serving-cert\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.264384 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1e611b5d-e10c-4148-a5af-b62505d05e74-etcd-client\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.264805 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/497dc758-b39a-4915-8eac-e4cb9e2e2a80-images\") pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.265582 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/497dc758-b39a-4915-8eac-e4cb9e2e2a80-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.265828 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-serving-cert\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.266548 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-oauth-serving-cert\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.266797 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-config\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.266993 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/048512de-ffeb-44c8-a11f-58513ae09db2-config\") pod \"kube-apiserver-operator-766d6c64bb-8vp8h\" (UID: \"048512de-ffeb-44c8-a11f-58513ae09db2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.267023 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1e611b5d-e10c-4148-a5af-b62505d05e74-encryption-config\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.269017 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-dir\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.269475 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.270461 4867 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.270507 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e9bd61-4b86-4b8f-873e-6a143973f249-config\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.271954 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f59db107-9767-4161-83f0-09f15ba1d881-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.272530 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.273457 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 
13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.277442 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b2e9bd61-4b86-4b8f-873e-6a143973f249-etcd-client\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.281078 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-metrics-certs\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.288055 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.288512 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgz92\" (UniqueName: \"kubernetes.io/projected/fa021310-c3a2-4feb-93a7-0b2eb6307147-kube-api-access-xgz92\") pod \"oauth-openshift-558db77b4-ftjpf\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.288839 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/470ad2dd-46cd-49e5-ac59-032631bfcb0b-serving-cert\") pod \"openshift-config-operator-7777fb866f-75xtb\" (UID: \"470ad2dd-46cd-49e5-ac59-032631bfcb0b\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.290407 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c451a50-f142-4702-91ac-987dc000746b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h9grd\" (UID: \"8c451a50-f142-4702-91ac-987dc000746b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.290861 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-metrics-tls\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.292113 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb"] Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.292742 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2e9bd61-4b86-4b8f-873e-6a143973f249-serving-cert\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.296322 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdldb\" (UniqueName: \"kubernetes.io/projected/8c451a50-f142-4702-91ac-987dc000746b-kube-api-access-zdldb\") pod \"olm-operator-6b444d44fb-h9grd\" (UID: \"8c451a50-f142-4702-91ac-987dc000746b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.297202 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbc6c\" (UniqueName: \"kubernetes.io/projected/674661a9-9e17-4d57-b887-8294a70fdcad-kube-api-access-nbc6c\") pod \"controller-manager-879f6c89f-j646b\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.314129 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-487lp\" (UniqueName: \"kubernetes.io/projected/f98ebfd5-26fa-49e3-a072-2578e344889b-kube-api-access-487lp\") pod \"machine-approver-56656f9798-sqlvm\" (UID: \"f98ebfd5-26fa-49e3-a072-2578e344889b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.315875 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.333565 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l85l\" (UniqueName: \"kubernetes.io/projected/b2e9bd61-4b86-4b8f-873e-6a143973f249-kube-api-access-2l85l\") pod \"etcd-operator-b45778765-87z4s\" (UID: \"b2e9bd61-4b86-4b8f-873e-6a143973f249\") " pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: W1006 13:06:02.340654 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf98ebfd5_26fa_49e3_a072_2578e344889b.slice/crio-4ce6fabb41ef7da3f1d6bb07191b43e8f2b181901276ed40a5c60bb032eb8022 WatchSource:0}: Error finding container 4ce6fabb41ef7da3f1d6bb07191b43e8f2b181901276ed40a5c60bb032eb8022: Status 404 returned error can't find the container with id 4ce6fabb41ef7da3f1d6bb07191b43e8f2b181901276ed40a5c60bb032eb8022 Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 
13:06:02.344882 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345008 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b9f0a917-6145-4144-b761-62695806f129-tmpfs\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345028 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9f0a917-6145-4144-b761-62695806f129-apiservice-cert\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345046 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e4f928da-a8de-4a0b-a256-0002c1b76e81-node-bootstrap-token\") pod \"machine-config-server-rff6q\" (UID: \"e4f928da-a8de-4a0b-a256-0002c1b76e81\") " pod="openshift-machine-config-operator/machine-config-server-rff6q" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345062 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xj8lg\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345081 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aab82a97-184d-4b45-b051-f6fbdb925819-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zszfp\" (UID: \"aab82a97-184d-4b45-b051-f6fbdb925819\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345096 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67691fb1-2774-4035-91b1-ca6c26fa7de6-config\") pod \"service-ca-operator-777779d784-fp56z\" (UID: \"67691fb1-2774-4035-91b1-ca6c26fa7de6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.345134 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:02.845109654 +0000 UTC m=+142.303057848 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345190 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/59dff37a-5c33-4cb3-a10d-e73fe741d7e3-signing-cabundle\") pod \"service-ca-9c57cc56f-ggkgn\" (UID: \"59dff37a-5c33-4cb3-a10d-e73fe741d7e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345238 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-mountpoint-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345304 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345331 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9b0008a7-3dfa-4e42-be35-3493c341fc69-srv-cert\") pod \"catalog-operator-68c6474976-tcmnq\" (UID: 
\"9b0008a7-3dfa-4e42-be35-3493c341fc69\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345362 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48abb7a8-e725-44c3-b95a-25c70999773f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8kf44\" (UID: \"48abb7a8-e725-44c3-b95a-25c70999773f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345387 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d606431-b258-4baa-bed9-95e82471aa02-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zl588\" (UID: \"1d606431-b258-4baa-bed9-95e82471aa02\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345414 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxhhj\" (UniqueName: \"kubernetes.io/projected/b0af59da-ee70-4965-88c1-422936f4156d-kube-api-access-hxhhj\") pod \"dns-default-4k5tp\" (UID: \"b0af59da-ee70-4965-88c1-422936f4156d\") " pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345439 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9f0a917-6145-4144-b761-62695806f129-webhook-cert\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345462 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/4c8dd3b2-db00-42b3-a439-43360407c2bd-cert\") pod \"ingress-canary-ghwjb\" (UID: \"4c8dd3b2-db00-42b3-a439-43360407c2bd\") " pod="openshift-ingress-canary/ingress-canary-ghwjb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345490 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsnbj\" (UniqueName: \"kubernetes.io/projected/1204892b-a86d-4b14-9aca-1fcbd64c9cd2-kube-api-access-xsnbj\") pod \"control-plane-machine-set-operator-78cbb6b69f-dclxp\" (UID: \"1204892b-a86d-4b14-9aca-1fcbd64c9cd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345515 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq6td\" (UniqueName: \"kubernetes.io/projected/e2b1c8fc-889a-4fd7-b34a-72c9c6575b93-kube-api-access-cq6td\") pod \"migrator-59844c95c7-kv4q6\" (UID: \"e2b1c8fc-889a-4fd7-b34a-72c9c6575b93\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345542 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b5sb\" (UniqueName: \"kubernetes.io/projected/3f1a0563-0bdb-42ae-8754-ef1f299414d8-kube-api-access-6b5sb\") pod \"multus-admission-controller-857f4d67dd-hzns5\" (UID: \"3f1a0563-0bdb-42ae-8754-ef1f299414d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345568 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7v7g\" (UniqueName: \"kubernetes.io/projected/67691fb1-2774-4035-91b1-ca6c26fa7de6-kube-api-access-s7v7g\") pod \"service-ca-operator-777779d784-fp56z\" (UID: \"67691fb1-2774-4035-91b1-ca6c26fa7de6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" Oct 06 13:06:02 
crc kubenswrapper[4867]: I1006 13:06:02.345591 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d606431-b258-4baa-bed9-95e82471aa02-proxy-tls\") pod \"machine-config-controller-84d6567774-zl588\" (UID: \"1d606431-b258-4baa-bed9-95e82471aa02\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345617 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0af59da-ee70-4965-88c1-422936f4156d-metrics-tls\") pod \"dns-default-4k5tp\" (UID: \"b0af59da-ee70-4965-88c1-422936f4156d\") " pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345657 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-csi-data-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345686 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-socket-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345698 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67691fb1-2774-4035-91b1-ca6c26fa7de6-config\") pod \"service-ca-operator-777779d784-fp56z\" (UID: \"67691fb1-2774-4035-91b1-ca6c26fa7de6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 
13:06:02.345716 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xhxh\" (UniqueName: \"kubernetes.io/projected/e4f928da-a8de-4a0b-a256-0002c1b76e81-kube-api-access-2xhxh\") pod \"machine-config-server-rff6q\" (UID: \"e4f928da-a8de-4a0b-a256-0002c1b76e81\") " pod="openshift-machine-config-operator/machine-config-server-rff6q" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345786 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ztc2\" (UniqueName: \"kubernetes.io/projected/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-kube-api-access-4ztc2\") pod \"marketplace-operator-79b997595-xj8lg\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345837 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm465\" (UniqueName: \"kubernetes.io/projected/59dff37a-5c33-4cb3-a10d-e73fe741d7e3-kube-api-access-gm465\") pod \"service-ca-9c57cc56f-ggkgn\" (UID: \"59dff37a-5c33-4cb3-a10d-e73fe741d7e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345864 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1204892b-a86d-4b14-9aca-1fcbd64c9cd2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dclxp\" (UID: \"1204892b-a86d-4b14-9aca-1fcbd64c9cd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345913 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkpmx\" (UniqueName: \"kubernetes.io/projected/217a87bd-eeb4-49fa-ad25-c8d35ede1637-kube-api-access-tkpmx\") pod 
\"package-server-manager-789f6589d5-rhkpx\" (UID: \"217a87bd-eeb4-49fa-ad25-c8d35ede1637\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345939 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3f1a0563-0bdb-42ae-8754-ef1f299414d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hzns5\" (UID: \"3f1a0563-0bdb-42ae-8754-ef1f299414d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345967 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0af59da-ee70-4965-88c1-422936f4156d-config-volume\") pod \"dns-default-4k5tp\" (UID: \"b0af59da-ee70-4965-88c1-422936f4156d\") " pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.345992 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqxqb\" (UniqueName: \"kubernetes.io/projected/1d606431-b258-4baa-bed9-95e82471aa02-kube-api-access-gqxqb\") pod \"machine-config-controller-84d6567774-zl588\" (UID: \"1d606431-b258-4baa-bed9-95e82471aa02\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346018 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc2bc\" (UniqueName: \"kubernetes.io/projected/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-kube-api-access-bc2bc\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346045 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/217a87bd-eeb4-49fa-ad25-c8d35ede1637-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rhkpx\" (UID: \"217a87bd-eeb4-49fa-ad25-c8d35ede1637\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346067 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-plugins-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346090 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aab82a97-184d-4b45-b051-f6fbdb925819-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zszfp\" (UID: \"aab82a97-184d-4b45-b051-f6fbdb925819\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346124 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9b0008a7-3dfa-4e42-be35-3493c341fc69-profile-collector-cert\") pod \"catalog-operator-68c6474976-tcmnq\" (UID: \"9b0008a7-3dfa-4e42-be35-3493c341fc69\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346192 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2n7s\" (UniqueName: \"kubernetes.io/projected/598298ab-238a-4776-9e23-e66c273dc805-kube-api-access-v2n7s\") pod \"collect-profiles-29329260-qlqqc\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346217 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn4xh\" (UniqueName: \"kubernetes.io/projected/9b0008a7-3dfa-4e42-be35-3493c341fc69-kube-api-access-fn4xh\") pod \"catalog-operator-68c6474976-tcmnq\" (UID: \"9b0008a7-3dfa-4e42-be35-3493c341fc69\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346271 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-registration-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346297 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67691fb1-2774-4035-91b1-ca6c26fa7de6-serving-cert\") pod \"service-ca-operator-777779d784-fp56z\" (UID: \"67691fb1-2774-4035-91b1-ca6c26fa7de6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346329 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/598298ab-238a-4776-9e23-e66c273dc805-secret-volume\") pod \"collect-profiles-29329260-qlqqc\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346354 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/59dff37a-5c33-4cb3-a10d-e73fe741d7e3-signing-key\") pod 
\"service-ca-9c57cc56f-ggkgn\" (UID: \"59dff37a-5c33-4cb3-a10d-e73fe741d7e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346378 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/598298ab-238a-4776-9e23-e66c273dc805-config-volume\") pod \"collect-profiles-29329260-qlqqc\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346407 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xj8lg\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346435 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48abb7a8-e725-44c3-b95a-25c70999773f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8kf44\" (UID: \"48abb7a8-e725-44c3-b95a-25c70999773f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346462 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pjmd\" (UniqueName: \"kubernetes.io/projected/48abb7a8-e725-44c3-b95a-25c70999773f-kube-api-access-2pjmd\") pod \"kube-storage-version-migrator-operator-b67b599dd-8kf44\" (UID: \"48abb7a8-e725-44c3-b95a-25c70999773f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 
13:06:02.346498 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcjkt\" (UniqueName: \"kubernetes.io/projected/4c8dd3b2-db00-42b3-a439-43360407c2bd-kube-api-access-xcjkt\") pod \"ingress-canary-ghwjb\" (UID: \"4c8dd3b2-db00-42b3-a439-43360407c2bd\") " pod="openshift-ingress-canary/ingress-canary-ghwjb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346525 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kpvj\" (UniqueName: \"kubernetes.io/projected/b9f0a917-6145-4144-b761-62695806f129-kube-api-access-5kpvj\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346548 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e4f928da-a8de-4a0b-a256-0002c1b76e81-certs\") pod \"machine-config-server-rff6q\" (UID: \"e4f928da-a8de-4a0b-a256-0002c1b76e81\") " pod="openshift-machine-config-operator/machine-config-server-rff6q" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.346572 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab82a97-184d-4b45-b051-f6fbdb925819-config\") pod \"kube-controller-manager-operator-78b949d7b-zszfp\" (UID: \"aab82a97-184d-4b45-b051-f6fbdb925819\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.347347 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/59dff37a-5c33-4cb3-a10d-e73fe741d7e3-signing-cabundle\") pod \"service-ca-9c57cc56f-ggkgn\" (UID: \"59dff37a-5c33-4cb3-a10d-e73fe741d7e3\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.347671 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab82a97-184d-4b45-b051-f6fbdb925819-config\") pod \"kube-controller-manager-operator-78b949d7b-zszfp\" (UID: \"aab82a97-184d-4b45-b051-f6fbdb925819\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.347947 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-registration-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.348597 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1d606431-b258-4baa-bed9-95e82471aa02-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zl588\" (UID: \"1d606431-b258-4baa-bed9-95e82471aa02\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.349609 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b9f0a917-6145-4144-b761-62695806f129-tmpfs\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.351627 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-socket-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: 
\"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.352055 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-csi-data-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.352508 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48abb7a8-e725-44c3-b95a-25c70999773f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8kf44\" (UID: \"48abb7a8-e725-44c3-b95a-25c70999773f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.352640 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-mountpoint-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.352642 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-plugins-dir\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.352957 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 13:06:02.852941069 +0000 UTC m=+142.310889303 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.353723 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0af59da-ee70-4965-88c1-422936f4156d-config-volume\") pod \"dns-default-4k5tp\" (UID: \"b0af59da-ee70-4965-88c1-422936f4156d\") " pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.353926 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48abb7a8-e725-44c3-b95a-25c70999773f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8kf44\" (UID: \"48abb7a8-e725-44c3-b95a-25c70999773f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.354155 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e4f928da-a8de-4a0b-a256-0002c1b76e81-node-bootstrap-token\") pod \"machine-config-server-rff6q\" (UID: \"e4f928da-a8de-4a0b-a256-0002c1b76e81\") " pod="openshift-machine-config-operator/machine-config-server-rff6q" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.355295 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xj8lg\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.355514 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/598298ab-238a-4776-9e23-e66c273dc805-config-volume\") pod \"collect-profiles-29329260-qlqqc\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.357199 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/598298ab-238a-4776-9e23-e66c273dc805-secret-volume\") pod \"collect-profiles-29329260-qlqqc\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.357581 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9b0008a7-3dfa-4e42-be35-3493c341fc69-srv-cert\") pod \"catalog-operator-68c6474976-tcmnq\" (UID: \"9b0008a7-3dfa-4e42-be35-3493c341fc69\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.357698 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9b0008a7-3dfa-4e42-be35-3493c341fc69-profile-collector-cert\") pod \"catalog-operator-68c6474976-tcmnq\" (UID: \"9b0008a7-3dfa-4e42-be35-3493c341fc69\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.357772 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1204892b-a86d-4b14-9aca-1fcbd64c9cd2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dclxp\" (UID: \"1204892b-a86d-4b14-9aca-1fcbd64c9cd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.358021 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d606431-b258-4baa-bed9-95e82471aa02-proxy-tls\") pod \"machine-config-controller-84d6567774-zl588\" (UID: \"1d606431-b258-4baa-bed9-95e82471aa02\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.358188 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9f0a917-6145-4144-b761-62695806f129-webhook-cert\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.358402 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aab82a97-184d-4b45-b051-f6fbdb925819-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zszfp\" (UID: \"aab82a97-184d-4b45-b051-f6fbdb925819\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.358710 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c8dd3b2-db00-42b3-a439-43360407c2bd-cert\") pod \"ingress-canary-ghwjb\" (UID: \"4c8dd3b2-db00-42b3-a439-43360407c2bd\") " 
pod="openshift-ingress-canary/ingress-canary-ghwjb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.358715 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xj8lg\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.358807 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/048512de-ffeb-44c8-a11f-58513ae09db2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8vp8h\" (UID: \"048512de-ffeb-44c8-a11f-58513ae09db2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.358810 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0af59da-ee70-4965-88c1-422936f4156d-metrics-tls\") pod \"dns-default-4k5tp\" (UID: \"b0af59da-ee70-4965-88c1-422936f4156d\") " pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.360216 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67691fb1-2774-4035-91b1-ca6c26fa7de6-serving-cert\") pod \"service-ca-operator-777779d784-fp56z\" (UID: \"67691fb1-2774-4035-91b1-ca6c26fa7de6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.361234 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3f1a0563-0bdb-42ae-8754-ef1f299414d8-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hzns5\" (UID: 
\"3f1a0563-0bdb-42ae-8754-ef1f299414d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.361343 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9f0a917-6145-4144-b761-62695806f129-apiservice-cert\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.361581 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/59dff37a-5c33-4cb3-a10d-e73fe741d7e3-signing-key\") pod \"service-ca-9c57cc56f-ggkgn\" (UID: \"59dff37a-5c33-4cb3-a10d-e73fe741d7e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.361743 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e4f928da-a8de-4a0b-a256-0002c1b76e81-certs\") pod \"machine-config-server-rff6q\" (UID: \"e4f928da-a8de-4a0b-a256-0002c1b76e81\") " pod="openshift-machine-config-operator/machine-config-server-rff6q" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.362157 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/217a87bd-eeb4-49fa-ad25-c8d35ede1637-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rhkpx\" (UID: \"217a87bd-eeb4-49fa-ad25-c8d35ede1637\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.373023 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw52g\" (UniqueName: 
\"kubernetes.io/projected/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-kube-api-access-dw52g\") pod \"console-f9d7485db-rqnc4\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.394884 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smc5k\" (UniqueName: \"kubernetes.io/projected/470ad2dd-46cd-49e5-ac59-032631bfcb0b-kube-api-access-smc5k\") pod \"openshift-config-operator-7777fb866f-75xtb\" (UID: \"470ad2dd-46cd-49e5-ac59-032631bfcb0b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.437995 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mx9n\" (UniqueName: \"kubernetes.io/projected/d2044b33-c26a-4abb-b304-d7fa7eaaec71-kube-api-access-9mx9n\") pod \"dns-operator-744455d44c-2vpqs\" (UID: \"d2044b33-c26a-4abb-b304-d7fa7eaaec71\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.441386 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.448164 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.448351 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 13:06:02.948319781 +0000 UTC m=+142.406267925 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.448590 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.448951 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:02.948938677 +0000 UTC m=+142.406886821 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.454598 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrtl\" (UniqueName: \"kubernetes.io/projected/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-kube-api-access-qsrtl\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.457550 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.473677 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.475892 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdkbv\" (UniqueName: \"kubernetes.io/projected/497dc758-b39a-4915-8eac-e4cb9e2e2a80-kube-api-access-gdkbv\") pod \"machine-config-operator-74547568cd-zkcnc\" (UID: \"497dc758-b39a-4915-8eac-e4cb9e2e2a80\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.494345 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.495588 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5m2m\" (UniqueName: \"kubernetes.io/projected/1e611b5d-e10c-4148-a5af-b62505d05e74-kube-api-access-s5m2m\") pod \"apiserver-7bbb656c7d-nplc7\" (UID: \"1e611b5d-e10c-4148-a5af-b62505d05e74\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.501888 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.517556 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8clxg\" (UID: \"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.537896 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.539443 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pjqs\" (UniqueName: \"kubernetes.io/projected/ed7648e1-d992-4263-9117-e50cd88a66a9-kube-api-access-2pjqs\") pod \"machine-api-operator-5694c8668f-8lg7n\" (UID: \"ed7648e1-d992-4263-9117-e50cd88a66a9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.542973 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc"] Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.543962 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vf59b"] Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.544478 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.553197 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.553595 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.053364006 +0000 UTC m=+142.511312160 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.553861 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.554499 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.054489506 +0000 UTC m=+142.512437650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: W1006 13:06:02.558499 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b0dc85d_57e3_49f7_89b3_a89daad03f39.slice/crio-aa33131e34aa2512a260bd79c2088484c7315e2a251ef776e3c1f74b20aa9657 WatchSource:0}: Error finding container aa33131e34aa2512a260bd79c2088484c7315e2a251ef776e3c1f74b20aa9657: Status 404 returned error can't find the container with id aa33131e34aa2512a260bd79c2088484c7315e2a251ef776e3c1f74b20aa9657 Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.562948 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.563857 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7t46\" (UniqueName: \"kubernetes.io/projected/dc7ecd08-018a-482c-be65-b08bdcbf2ed6-kube-api-access-f7t46\") pod \"router-default-5444994796-svjt7\" (UID: \"dc7ecd08-018a-482c-be65-b08bdcbf2ed6\") " pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.580595 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkntm\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-kube-api-access-qkntm\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.595163 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dpb2\" (UniqueName: \"kubernetes.io/projected/fe98621b-3abf-47ac-a3f8-0abc2c371dcf-kube-api-access-6dpb2\") pod \"authentication-operator-69f744f599-bf8xq\" (UID: \"fe98621b-3abf-47ac-a3f8-0abc2c371dcf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.617114 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-bound-sa-token\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.632978 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.634909 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5d3717e-bfd0-4f99-9b43-f871cc3c1801-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkxnb\" (UID: \"a5d3717e-bfd0-4f99-9b43-f871cc3c1801\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.644466 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.649559 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h"] Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.657715 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2htdv\" (UniqueName: \"kubernetes.io/projected/2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0-kube-api-access-2htdv\") pod \"cluster-samples-operator-665b6dd947-s9l2c\" (UID: \"2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.658680 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.658883 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.158855393 +0000 UTC m=+142.616803547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.658983 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.659706 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.159693975 +0000 UTC m=+142.617642329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.663300 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.698267 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm465\" (UniqueName: \"kubernetes.io/projected/59dff37a-5c33-4cb3-a10d-e73fe741d7e3-kube-api-access-gm465\") pod \"service-ca-9c57cc56f-ggkgn\" (UID: \"59dff37a-5c33-4cb3-a10d-e73fe741d7e3\") " pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.701483 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.704390 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsnbj\" (UniqueName: \"kubernetes.io/projected/1204892b-a86d-4b14-9aca-1fcbd64c9cd2-kube-api-access-xsnbj\") pod \"control-plane-machine-set-operator-78cbb6b69f-dclxp\" (UID: \"1204892b-a86d-4b14-9aca-1fcbd64c9cd2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.718184 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq6td\" (UniqueName: \"kubernetes.io/projected/e2b1c8fc-889a-4fd7-b34a-72c9c6575b93-kube-api-access-cq6td\") pod \"migrator-59844c95c7-kv4q6\" (UID: \"e2b1c8fc-889a-4fd7-b34a-72c9c6575b93\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.743223 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b5sb\" (UniqueName: \"kubernetes.io/projected/3f1a0563-0bdb-42ae-8754-ef1f299414d8-kube-api-access-6b5sb\") pod \"multus-admission-controller-857f4d67dd-hzns5\" (UID: \"3f1a0563-0bdb-42ae-8754-ef1f299414d8\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.756641 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7v7g\" (UniqueName: \"kubernetes.io/projected/67691fb1-2774-4035-91b1-ca6c26fa7de6-kube-api-access-s7v7g\") pod \"service-ca-operator-777779d784-fp56z\" (UID: \"67691fb1-2774-4035-91b1-ca6c26fa7de6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.760419 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.760773 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.260757916 +0000 UTC m=+142.718706060 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.790665 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aab82a97-184d-4b45-b051-f6fbdb925819-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zszfp\" (UID: \"aab82a97-184d-4b45-b051-f6fbdb925819\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.795044 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn4xh\" (UniqueName: \"kubernetes.io/projected/9b0008a7-3dfa-4e42-be35-3493c341fc69-kube-api-access-fn4xh\") pod \"catalog-operator-68c6474976-tcmnq\" (UID: \"9b0008a7-3dfa-4e42-be35-3493c341fc69\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:02 
crc kubenswrapper[4867]: I1006 13:06:02.830100 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxhhj\" (UniqueName: \"kubernetes.io/projected/b0af59da-ee70-4965-88c1-422936f4156d-kube-api-access-hxhhj\") pod \"dns-default-4k5tp\" (UID: \"b0af59da-ee70-4965-88c1-422936f4156d\") " pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.833783 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.849666 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.853725 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpmx\" (UniqueName: \"kubernetes.io/projected/217a87bd-eeb4-49fa-ad25-c8d35ede1637-kube-api-access-tkpmx\") pod \"package-server-manager-789f6589d5-rhkpx\" (UID: \"217a87bd-eeb4-49fa-ad25-c8d35ede1637\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.854074 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.856233 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.858077 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqxqb\" (UniqueName: \"kubernetes.io/projected/1d606431-b258-4baa-bed9-95e82471aa02-kube-api-access-gqxqb\") pod \"machine-config-controller-84d6567774-zl588\" (UID: \"1d606431-b258-4baa-bed9-95e82471aa02\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.862929 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.863433 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.363396968 +0000 UTC m=+142.821345112 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.872108 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.876466 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.878125 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xhxh\" (UniqueName: \"kubernetes.io/projected/e4f928da-a8de-4a0b-a256-0002c1b76e81-kube-api-access-2xhxh\") pod \"machine-config-server-rff6q\" (UID: \"e4f928da-a8de-4a0b-a256-0002c1b76e81\") " pod="openshift-machine-config-operator/machine-config-server-rff6q" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.879536 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb" event={"ID":"d4a6a1dc-585a-43b9-afbe-d5054a71e70e","Type":"ContainerStarted","Data":"dd846c2ce779044e626fcff3bec0bc2d6b54a9453d907f1be41d5904953fc518"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.879581 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb" event={"ID":"d4a6a1dc-585a-43b9-afbe-d5054a71e70e","Type":"ContainerStarted","Data":"8cedaaaac99d83fff7b3a6e1ad46bd14732b84c8a4949ee42782cead25d7e0de"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.880470 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.882874 4867 generic.go:334] "Generic (PLEG): container finished" podID="21b2c8cf-2109-4663-bdb0-b106d2a4c548" containerID="4405c31cfb165d8eed86abddb5fae3f728b86aa1057e695f2aa00d03a40f08c7" exitCode=0 Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.885855 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pklsp" event={"ID":"21b2c8cf-2109-4663-bdb0-b106d2a4c548","Type":"ContainerDied","Data":"4405c31cfb165d8eed86abddb5fae3f728b86aa1057e695f2aa00d03a40f08c7"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.885988 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pklsp" event={"ID":"21b2c8cf-2109-4663-bdb0-b106d2a4c548","Type":"ContainerStarted","Data":"b45a939e396ce93f7ae86e1219d368a02d0499f544abf61fd554a817930ce97a"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.887486 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.906002 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" event={"ID":"f98ebfd5-26fa-49e3-a072-2578e344889b","Type":"ContainerStarted","Data":"013518db86a2dc02b6abe3cff8a2fff34c31adf673e3393e1b865582cc1d40da"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.906071 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" event={"ID":"f98ebfd5-26fa-49e3-a072-2578e344889b","Type":"ContainerStarted","Data":"4ce6fabb41ef7da3f1d6bb07191b43e8f2b181901276ed40a5c60bb032eb8022"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.918008 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.921276 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg"] Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.926117 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.926523 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vf59b" event={"ID":"ac64bd06-d021-43c4-8f20-bff80a406d77","Type":"ContainerStarted","Data":"43c94e783df90fb075ce725fcb7811b24ea13eb7e238179a0b5d8776783cf313"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.926562 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vf59b" event={"ID":"ac64bd06-d021-43c4-8f20-bff80a406d77","Type":"ContainerStarted","Data":"fac7b3f9af53f7deec77f9729e63369e4bfc427d509f351b924bd86b94e61ce0"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.927652 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ztc2\" (UniqueName: \"kubernetes.io/projected/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-kube-api-access-4ztc2\") pod \"marketplace-operator-79b997595-xj8lg\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.928099 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vf59b" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.929607 4867 patch_prober.go:28] interesting pod/console-operator-58897d9998-vf59b container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.929653 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vf59b" podUID="ac64bd06-d021-43c4-8f20-bff80a406d77" 
containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.932053 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc" event={"ID":"4b0dc85d-57e3-49f7-89b3-a89daad03f39","Type":"ContainerStarted","Data":"6e619877c6e4da893c884fa2983e968f831972f433e2167c360d4ea321833421"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.932089 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc" event={"ID":"4b0dc85d-57e3-49f7-89b3-a89daad03f39","Type":"ContainerStarted","Data":"aa33131e34aa2512a260bd79c2088484c7315e2a251ef776e3c1f74b20aa9657"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.934555 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.935438 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc2bc\" (UniqueName: \"kubernetes.io/projected/5b5edafd-513a-4a3a-bbc8-b027d6afda2c-kube-api-access-bc2bc\") pod \"csi-hostpathplugin-zpvzr\" (UID: \"5b5edafd-513a-4a3a-bbc8-b027d6afda2c\") " pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.935499 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt" event={"ID":"a370043f-0cb1-4279-8ebd-d3b15ef2980a","Type":"ContainerStarted","Data":"5153fb15bb02425242a05f293b7db9d753e39fb74c2a9529ebddbbc341b5782b"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.935545 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt" event={"ID":"a370043f-0cb1-4279-8ebd-d3b15ef2980a","Type":"ContainerStarted","Data":"7b9ffa0c5bbd3499d5687b90ea1baf57272f7986ccaf8d849dbaa52fc9f08d70"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.936856 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" event={"ID":"048512de-ffeb-44c8-a11f-58513ae09db2","Type":"ContainerStarted","Data":"002b4b4d185a2e9b7e466f0a96bfd2dc1640f55c77987fc96fa557c3bb79d0e1"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.942583 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.947689 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x4d8v" event={"ID":"8826a928-e7d1-4cb1-bd08-69849ee5a12b","Type":"ContainerStarted","Data":"ac4877164596551fe32d7abfd6f5d052db5cdd0653227c99f3b843f49d8469fd"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.947740 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-x4d8v" event={"ID":"8826a928-e7d1-4cb1-bd08-69849ee5a12b","Type":"ContainerStarted","Data":"a81fef00c59b559a5d71f4fce83f68fb05e879443e358530a1eef8a644f36282"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.949126 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-x4d8v" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.949945 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2n7s\" (UniqueName: \"kubernetes.io/projected/598298ab-238a-4776-9e23-e66c273dc805-kube-api-access-v2n7s\") pod \"collect-profiles-29329260-qlqqc\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.950567 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-x4d8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.950615 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x4d8v" podUID="8826a928-e7d1-4cb1-bd08-69849ee5a12b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.951530 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" event={"ID":"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca","Type":"ContainerStarted","Data":"f23823ef72bc471ab9f68c0ba8f8a908d55659711ccb2cca38713ef9c8d50c99"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.951560 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" event={"ID":"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca","Type":"ContainerStarted","Data":"88b79c911c935f4f988a317f8104812bebf557fd2047b5d70345c24dba53d3c3"} Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.954146 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.955207 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.957972 4867 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-m6n6l container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.958021 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" podUID="0bd3770c-b6a9-42a3-9530-30ef3b90c7ca" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.965017 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcjkt\" (UniqueName: \"kubernetes.io/projected/4c8dd3b2-db00-42b3-a439-43360407c2bd-kube-api-access-xcjkt\") pod \"ingress-canary-ghwjb\" (UID: \"4c8dd3b2-db00-42b3-a439-43360407c2bd\") " pod="openshift-ingress-canary/ingress-canary-ghwjb" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.965399 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rff6q" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.969431 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:02 crc kubenswrapper[4867]: E1006 13:06:02.971172 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.471155815 +0000 UTC m=+142.929103959 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.972707 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.975987 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pjmd\" (UniqueName: \"kubernetes.io/projected/48abb7a8-e725-44c3-b95a-25c70999773f-kube-api-access-2pjmd\") pod \"kube-storage-version-migrator-operator-b67b599dd-8kf44\" (UID: \"48abb7a8-e725-44c3-b95a-25c70999773f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.983216 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j646b"] Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.985768 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ftjpf"] Oct 06 13:06:02 crc kubenswrapper[4867]: I1006 13:06:02.998638 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.002619 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ghwjb" Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.002792 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kpvj\" (UniqueName: \"kubernetes.io/projected/b9f0a917-6145-4144-b761-62695806f129-kube-api-access-5kpvj\") pod \"packageserver-d55dfcdfc-l64rw\" (UID: \"b9f0a917-6145-4144-b761-62695806f129\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.078587 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:03 crc kubenswrapper[4867]: E1006 13:06:03.081203 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.581189181 +0000 UTC m=+143.039137325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.087427 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.097277 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-87z4s"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.180012 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:03 crc kubenswrapper[4867]: E1006 13:06:03.180763 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.680747632 +0000 UTC m=+143.138695776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.188296 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.193672 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.198611 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rqnc4"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.201852 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.209738 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.249005 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.282123 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:03 crc kubenswrapper[4867]: E1006 13:06:03.282456 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.78244159 +0000 UTC m=+143.240389734 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.379934 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8lg7n"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.382826 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:03 crc kubenswrapper[4867]: E1006 
13:06:03.383224 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.883207713 +0000 UTC m=+143.341155857 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.389919 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.441452 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-75xtb"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.463445 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2vpqs"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.472447 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.484638 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:03 crc kubenswrapper[4867]: E1006 13:06:03.486176 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:03.986161142 +0000 UTC m=+143.444109286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.589110 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:03 crc kubenswrapper[4867]: E1006 13:06:03.589558 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:04.089534103 +0000 UTC m=+143.547482247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.600472 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:03 crc kubenswrapper[4867]: E1006 13:06:03.600904 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:04.100890441 +0000 UTC m=+143.558838585 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:03 crc kubenswrapper[4867]: W1006 13:06:03.608454 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e611b5d_e10c_4148_a5af_b62505d05e74.slice/crio-688836e7bac5a3ad1340e1b28a5831974e7647240cc55771a5f7d4d52daadf68 WatchSource:0}: Error finding container 688836e7bac5a3ad1340e1b28a5831974e7647240cc55771a5f7d4d52daadf68: Status 404 returned error can't find the container with id 688836e7bac5a3ad1340e1b28a5831974e7647240cc55771a5f7d4d52daadf68 Oct 06 13:06:03 crc kubenswrapper[4867]: W1006 13:06:03.626813 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod470ad2dd_46cd_49e5_ac59_032631bfcb0b.slice/crio-934f840be333983ab3f8f27b278b0eed02faf982206ca777441e98b0a943ab78 WatchSource:0}: Error finding container 934f840be333983ab3f8f27b278b0eed02faf982206ca777441e98b0a943ab78: Status 404 returned error can't find the container with id 934f840be333983ab3f8f27b278b0eed02faf982206ca777441e98b0a943ab78 Oct 06 13:06:03 crc kubenswrapper[4867]: W1006 13:06:03.655827 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2044b33_c26a_4abb_b304_d7fa7eaaec71.slice/crio-179c3b2fe81dda8e4e029c4aaf40838d3ad5501d2c43264a6c3d0ee805b25c3a WatchSource:0}: Error finding container 179c3b2fe81dda8e4e029c4aaf40838d3ad5501d2c43264a6c3d0ee805b25c3a: Status 404 returned error can't find the container 
with id 179c3b2fe81dda8e4e029c4aaf40838d3ad5501d2c43264a6c3d0ee805b25c3a Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.664855 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.700985 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:03 crc kubenswrapper[4867]: E1006 13:06:03.701555 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:04.201540471 +0000 UTC m=+143.659488615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.701589 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.738235 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bf8xq"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.778062 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h5sdt" podStartSLOduration=123.778030427 podStartE2EDuration="2m3.778030427s" podCreationTimestamp="2025-10-06 13:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:03.772308737 +0000 UTC m=+143.230256881" watchObservedRunningTime="2025-10-06 13:06:03.778030427 +0000 UTC m=+143.235978571" Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.802385 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:03 crc kubenswrapper[4867]: E1006 13:06:03.802756 4867 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:04.302743956 +0000 UTC m=+143.760692100 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.903819 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:03 crc kubenswrapper[4867]: E1006 13:06:03.908133 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:04.40811518 +0000 UTC m=+143.866063324 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.924084 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-x4d8v" podStartSLOduration=122.924070638 podStartE2EDuration="2m2.924070638s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:03.858125568 +0000 UTC m=+143.316073712" watchObservedRunningTime="2025-10-06 13:06:03.924070638 +0000 UTC m=+143.382018782" Oct 06 13:06:03 crc kubenswrapper[4867]: W1006 13:06:03.925648 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe98621b_3abf_47ac_a3f8_0abc2c371dcf.slice/crio-081e298fd1abb52d325f5dda5ec460e573b7d588df4fa7dde663eee4a8bb0ce1 WatchSource:0}: Error finding container 081e298fd1abb52d325f5dda5ec460e573b7d588df4fa7dde663eee4a8bb0ce1: Status 404 returned error can't find the container with id 081e298fd1abb52d325f5dda5ec460e573b7d588df4fa7dde663eee4a8bb0ce1 Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.996555 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hzns5"] Oct 06 13:06:03 crc kubenswrapper[4867]: I1006 13:06:03.998905 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" 
podStartSLOduration=122.99888669 podStartE2EDuration="2m2.99888669s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:03.993828568 +0000 UTC m=+143.451776712" watchObservedRunningTime="2025-10-06 13:06:03.99888669 +0000 UTC m=+143.456834834" Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.000000 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zl588"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.000043 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-svjt7" event={"ID":"dc7ecd08-018a-482c-be65-b08bdcbf2ed6","Type":"ContainerStarted","Data":"c2f71d308d6fcc0d1c2aa7c61bd054b12d7fed97055909155d64e8f4327fa43c"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.000057 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-svjt7" event={"ID":"dc7ecd08-018a-482c-be65-b08bdcbf2ed6","Type":"ContainerStarted","Data":"c7c54730a31f038d5a77209469b3ae831cc228e16b743803bf8d1d6fb2e4b986"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.002134 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" event={"ID":"a5d3717e-bfd0-4f99-9b43-f871cc3c1801","Type":"ContainerStarted","Data":"7618d348b21cf60c7bb42b55e5507c4aed7f1ca54655848da4fdfab9c6f0faa5"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.009273 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.009595 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:04.509582991 +0000 UTC m=+143.967531135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.021235 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" event={"ID":"f98ebfd5-26fa-49e3-a072-2578e344889b","Type":"ContainerStarted","Data":"74633c7dbdef2f2889ab2ec02926d090e38d26865dc95f767d76b009f01a05d1"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.039449 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" event={"ID":"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70","Type":"ContainerStarted","Data":"5e506349fff2469c8e3423b47eb8038e17760a320ae2e7795a155a7731e5a599"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.039488 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" event={"ID":"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70","Type":"ContainerStarted","Data":"14f9895cb0a8f9418ac6234f1d9b130be44ed53ed2cb1d5b75641af4e4724678"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.057814 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/console-f9d7485db-rqnc4" event={"ID":"a85dd45a-f972-4cb7-aa77-e2f8468df1cf","Type":"ContainerStarted","Data":"a6fb20885bc0e5c7d1b5eb63ef41fb249b47eb0aa91a5afb9019f71b5a93c690"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.058677 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" event={"ID":"2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0","Type":"ContainerStarted","Data":"47c39519eeff0d717f79fec33d319eca8dfaf7e2df6e189d50809b2da4d39bc5"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.059652 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp" event={"ID":"1204892b-a86d-4b14-9aca-1fcbd64c9cd2","Type":"ContainerStarted","Data":"47e374bd0764058e88442320cf4870c1e6646c6b3b7b213d1c2da5d946999d7b"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.060296 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" event={"ID":"8c451a50-f142-4702-91ac-987dc000746b","Type":"ContainerStarted","Data":"953c95847b498544b87294ca4f5946886b55f28658981b4fb9549302f8426d54"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.060900 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" event={"ID":"ed7648e1-d992-4263-9117-e50cd88a66a9","Type":"ContainerStarted","Data":"bf60190912be3fbdebd9c7cfdfd06ce57dcafbae224f8df5dbcf89aca5c169ca"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.073521 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" event={"ID":"fa021310-c3a2-4feb-93a7-0b2eb6307147","Type":"ContainerStarted","Data":"f9043228bbef51a1c80d575276e4a56f43447b754a13df57a16e473d586e5268"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 
13:06:04.074624 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ncwsb" podStartSLOduration=123.074474993 podStartE2EDuration="2m3.074474993s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:04.039223658 +0000 UTC m=+143.497171802" watchObservedRunningTime="2025-10-06 13:06:04.074474993 +0000 UTC m=+143.532423147" Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.085952 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" event={"ID":"048512de-ffeb-44c8-a11f-58513ae09db2","Type":"ContainerStarted","Data":"4f13cb9b8a4257aab015c1227e0e68cc020d9ca5da1cfc877192f58b80c7017c"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.090566 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rff6q" event={"ID":"e4f928da-a8de-4a0b-a256-0002c1b76e81","Type":"ContainerStarted","Data":"d2595fd9256adb73d8364d8bed0cf7de084d5cf93db40718f2d1d3f1ed6d21e2"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.096335 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" event={"ID":"470ad2dd-46cd-49e5-ac59-032631bfcb0b","Type":"ContainerStarted","Data":"934f840be333983ab3f8f27b278b0eed02faf982206ca777441e98b0a943ab78"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.110688 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.111541 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:04.611525605 +0000 UTC m=+144.069473749 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.130079 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" event={"ID":"674661a9-9e17-4d57-b887-8294a70fdcad","Type":"ContainerStarted","Data":"d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.130329 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" event={"ID":"674661a9-9e17-4d57-b887-8294a70fdcad","Type":"ContainerStarted","Data":"d48ba02f827af1b199fe5e223c30ebdefedb3e99e196f5e645bfd10711d33fbc"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.132806 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.132924 4867 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j646b container/controller-manager namespace/openshift-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.132959 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" podUID="674661a9-9e17-4d57-b887-8294a70fdcad" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.179940 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" event={"ID":"b2e9bd61-4b86-4b8f-873e-6a143973f249","Type":"ContainerStarted","Data":"5b56246eb5b50ac6f31c1d94a332183ebcb054631734e21de37af28194567e23"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.186922 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vf59b" podStartSLOduration=123.186901452 podStartE2EDuration="2m3.186901452s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:04.171881288 +0000 UTC m=+143.629829432" watchObservedRunningTime="2025-10-06 13:06:04.186901452 +0000 UTC m=+143.644849596" Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.188829 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ggkgn"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.211936 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: 
\"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.212911 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:04.712880073 +0000 UTC m=+144.170828217 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.222633 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fp56z"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.258083 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" event={"ID":"d2044b33-c26a-4abb-b304-d7fa7eaaec71","Type":"ContainerStarted","Data":"179c3b2fe81dda8e4e029c4aaf40838d3ad5501d2c43264a6c3d0ee805b25c3a"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.269243 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" event={"ID":"1e611b5d-e10c-4148-a5af-b62505d05e74","Type":"ContainerStarted","Data":"688836e7bac5a3ad1340e1b28a5831974e7647240cc55771a5f7d4d52daadf68"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.303496 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" 
event={"ID":"fe98621b-3abf-47ac-a3f8-0abc2c371dcf","Type":"ContainerStarted","Data":"081e298fd1abb52d325f5dda5ec460e573b7d588df4fa7dde663eee4a8bb0ce1"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.314784 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.315132 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:04.815116635 +0000 UTC m=+144.273064779 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.337302 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" event={"ID":"497dc758-b39a-4915-8eac-e4cb9e2e2a80","Type":"ContainerStarted","Data":"e2ce20b0786a60fddd1631153e4cc4f07426ad395cced31a34bcd8fafe4d997b"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.337343 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" 
event={"ID":"497dc758-b39a-4915-8eac-e4cb9e2e2a80","Type":"ContainerStarted","Data":"a537a565cc148f7b8418382ab5da4b301bee49211e67a7b3b8338340e6a76fc0"} Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.337967 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-x4d8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.337998 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x4d8v" podUID="8826a928-e7d1-4cb1-bd08-69849ee5a12b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.409443 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.419371 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.422718 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:04.922706127 +0000 UTC m=+144.380654271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.522863 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.523185 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:05.023165302 +0000 UTC m=+144.481113446 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.539783 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rwkhc" podStartSLOduration=123.539766457 podStartE2EDuration="2m3.539766457s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:04.538767761 +0000 UTC m=+143.996715905" watchObservedRunningTime="2025-10-06 13:06:04.539766457 +0000 UTC m=+143.997714601" Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.622029 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.626423 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.626712 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 13:06:05.126700138 +0000 UTC m=+144.584648282 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.643025 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.670168 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-zpvzr"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.686758 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ghwjb"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.693966 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4k5tp"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.701909 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.704315 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xj8lg"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.713712 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.730220 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.730621 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:05.230575972 +0000 UTC m=+144.688524116 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.730747 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.731195 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:05.231187318 +0000 UTC m=+144.689135462 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.735739 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.735905 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44"] Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.756077 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw"] Oct 06 13:06:04 crc kubenswrapper[4867]: W1006 13:06:04.812412 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1c41fc_3e0d_4048_97d2_54c54bc065e5.slice/crio-858e07a946a4fcc0475b2e70cf9b00e347fafe163324d1c0e8099989e3bad15c WatchSource:0}: Error finding container 858e07a946a4fcc0475b2e70cf9b00e347fafe163324d1c0e8099989e3bad15c: Status 404 returned error can't find the container with id 858e07a946a4fcc0475b2e70cf9b00e347fafe163324d1c0e8099989e3bad15c Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.815012 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8vp8h" podStartSLOduration=123.814994516 podStartE2EDuration="2m3.814994516s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:04.813434595 +0000 UTC m=+144.271382739" watchObservedRunningTime="2025-10-06 13:06:04.814994516 +0000 UTC m=+144.272942660" Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.833474 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.834173 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:05.334150669 +0000 UTC m=+144.792098823 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.834782 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.835366 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:05.335357111 +0000 UTC m=+144.793305255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.835637 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:04 crc kubenswrapper[4867]: W1006 13:06:04.841372 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaab82a97_184d_4b45_b051_f6fbdb925819.slice/crio-2c6b108846c561ccf4fe1e7b611f9033eff31653c44135598bc0963b5108af2d WatchSource:0}: Error finding container 2c6b108846c561ccf4fe1e7b611f9033eff31653c44135598bc0963b5108af2d: Status 404 returned error can't find the container with id 2c6b108846c561ccf4fe1e7b611f9033eff31653c44135598bc0963b5108af2d Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.841437 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:04 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:04 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:04 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.841468 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed 
with statuscode: 500" Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.909184 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-rff6q" podStartSLOduration=5.9091689469999995 podStartE2EDuration="5.909168947s" podCreationTimestamp="2025-10-06 13:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:04.85554885 +0000 UTC m=+144.313496994" watchObservedRunningTime="2025-10-06 13:06:04.909168947 +0000 UTC m=+144.367117091" Oct 06 13:06:04 crc kubenswrapper[4867]: I1006 13:06:04.937655 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:04 crc kubenswrapper[4867]: E1006 13:06:04.938066 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:05.438050084 +0000 UTC m=+144.895998228 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.049541 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:05 crc kubenswrapper[4867]: E1006 13:06:05.049942 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:05.549929449 +0000 UTC m=+145.007877593 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.065334 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sqlvm" podStartSLOduration=124.065314172 podStartE2EDuration="2m4.065314172s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:05.057376494 +0000 UTC m=+144.515324638" watchObservedRunningTime="2025-10-06 13:06:05.065314172 +0000 UTC m=+144.523262316" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.087202 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vf59b" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.150802 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:05 crc kubenswrapper[4867]: E1006 13:06:05.151145 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 13:06:05.651130723 +0000 UTC m=+145.109078867 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.255348 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:05 crc kubenswrapper[4867]: E1006 13:06:05.256450 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:05.756434505 +0000 UTC m=+145.214382649 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.341522 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-svjt7" podStartSLOduration=124.341504886 podStartE2EDuration="2m4.341504886s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:05.295487099 +0000 UTC m=+144.753435263" watchObservedRunningTime="2025-10-06 13:06:05.341504886 +0000 UTC m=+144.799453030" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.358904 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:05 crc kubenswrapper[4867]: E1006 13:06:05.359360 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:05.859345124 +0000 UTC m=+145.317293268 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.380410 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" podStartSLOduration=124.380392366 podStartE2EDuration="2m4.380392366s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:05.378210349 +0000 UTC m=+144.836158493" watchObservedRunningTime="2025-10-06 13:06:05.380392366 +0000 UTC m=+144.838340510" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.386022 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" event={"ID":"0e1c41fc-3e0d-4048-97d2-54c54bc065e5","Type":"ContainerStarted","Data":"858e07a946a4fcc0475b2e70cf9b00e347fafe163324d1c0e8099989e3bad15c"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.405705 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" event={"ID":"1d606431-b258-4baa-bed9-95e82471aa02","Type":"ContainerStarted","Data":"064eef27fe45e5efd3d201b0214510c9099048a844d82c92a1f3885e2b978663"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.460216 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" 
event={"ID":"48abb7a8-e725-44c3-b95a-25c70999773f","Type":"ContainerStarted","Data":"2275d86b3eae738cafc855a183b3e11342a812232f8e1e1933436ebdf5846292"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.460498 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:05 crc kubenswrapper[4867]: E1006 13:06:05.460740 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:05.960728454 +0000 UTC m=+145.418676598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.477367 4867 generic.go:334] "Generic (PLEG): container finished" podID="1e611b5d-e10c-4148-a5af-b62505d05e74" containerID="26ea3eb2f6a7213a23704b5ee98810c6441c97d3446bcedf3ac108e9b9d79914" exitCode=0 Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.477446 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" 
event={"ID":"1e611b5d-e10c-4148-a5af-b62505d05e74","Type":"ContainerDied","Data":"26ea3eb2f6a7213a23704b5ee98810c6441c97d3446bcedf3ac108e9b9d79914"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.479951 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" event={"ID":"9b0008a7-3dfa-4e42-be35-3493c341fc69","Type":"ContainerStarted","Data":"398d8db45d4f295d440fe05ebec7aece93ecc3258209f64f57772292bfcb7aba"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.524438 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" event={"ID":"aab82a97-184d-4b45-b051-f6fbdb925819","Type":"ContainerStarted","Data":"2c6b108846c561ccf4fe1e7b611f9033eff31653c44135598bc0963b5108af2d"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.538291 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" event={"ID":"8c451a50-f142-4702-91ac-987dc000746b","Type":"ContainerStarted","Data":"33f235c0785558451d24dd6c3a3d49a811199c2dc6923f20920f64484e9b023c"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.539305 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.547547 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.558857 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" event={"ID":"598298ab-238a-4776-9e23-e66c273dc805","Type":"ContainerStarted","Data":"f839f62f8796c1696c1090e47f4b5596529cf9e255a65862017e07880a0fd188"} Oct 06 13:06:05 crc 
kubenswrapper[4867]: I1006 13:06:05.562417 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:05 crc kubenswrapper[4867]: E1006 13:06:05.562718 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:06.062688328 +0000 UTC m=+145.520636472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.562779 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:05 crc kubenswrapper[4867]: E1006 13:06:05.563367 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 13:06:06.063355645 +0000 UTC m=+145.521303789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.574235 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h9grd" podStartSLOduration=124.57421452 podStartE2EDuration="2m4.57421452s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:05.572147576 +0000 UTC m=+145.030095720" watchObservedRunningTime="2025-10-06 13:06:05.57421452 +0000 UTC m=+145.032162664" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.577007 4867 generic.go:334] "Generic (PLEG): container finished" podID="470ad2dd-46cd-49e5-ac59-032631bfcb0b" containerID="607a4b96b9aaf676c702c2e8e6749f2e96680c0e4d5d1df2aa85b33a236c7fcf" exitCode=0 Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.577091 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" event={"ID":"470ad2dd-46cd-49e5-ac59-032631bfcb0b","Type":"ContainerDied","Data":"607a4b96b9aaf676c702c2e8e6749f2e96680c0e4d5d1df2aa85b33a236c7fcf"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.587784 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" 
event={"ID":"497dc758-b39a-4915-8eac-e4cb9e2e2a80","Type":"ContainerStarted","Data":"da87a8693f7819db6c066a4d2ba12419e28828e8e57d1d78cdd8da8d2db7855c"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.601955 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" event={"ID":"3f1a0563-0bdb-42ae-8754-ef1f299414d8","Type":"ContainerStarted","Data":"e99e806d6d0ba97e48dfa77f4641c8479b1b416e2dbef2ffc72e040b091eade5"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.602003 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" event={"ID":"3f1a0563-0bdb-42ae-8754-ef1f299414d8","Type":"ContainerStarted","Data":"c483ed5e5b2af28dc18278f67c63a9b9bf7c617375e99cce7814b09dc17ab997"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.621542 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" event={"ID":"67691fb1-2774-4035-91b1-ca6c26fa7de6","Type":"ContainerStarted","Data":"438e5741a47698470beeca0c31f36aa2ec96edea73742d7c7f8c1199584d3162"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.621596 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" event={"ID":"67691fb1-2774-4035-91b1-ca6c26fa7de6","Type":"ContainerStarted","Data":"776048c05d7bc1b2d3873a63c4cc1c500f51f6f43ed2681303d08ba36803e72a"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.652544 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" event={"ID":"59dff37a-5c33-4cb3-a10d-e73fe741d7e3","Type":"ContainerStarted","Data":"0bfb42b81b859fda391f3444cbe60d1043ac5f0cc6ab53fe30236c193140e29b"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.652592 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" event={"ID":"59dff37a-5c33-4cb3-a10d-e73fe741d7e3","Type":"ContainerStarted","Data":"881f111918c69966226676e1d80db43f970da0da8d4ab30f0b10c887d384b461"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.665747 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:05 crc kubenswrapper[4867]: E1006 13:06:05.667190 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:06.167167938 +0000 UTC m=+145.625116122 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.688725 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" event={"ID":"2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0","Type":"ContainerStarted","Data":"d59697b169e53bc373a7acd1b8b6a7eecd5c83f4be6e26b6ad373c6c3a17dd9e"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.697662 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" podStartSLOduration=124.697630717 podStartE2EDuration="2m4.697630717s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:05.65351499 +0000 UTC m=+145.111463134" watchObservedRunningTime="2025-10-06 13:06:05.697630717 +0000 UTC m=+145.155578861" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.711638 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" event={"ID":"fa021310-c3a2-4feb-93a7-0b2eb6307147","Type":"ContainerStarted","Data":"eff23dc662105c5333071fa89927258b06bbb2f3d97487fef7d0dbec738a61c9"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.712329 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.716811 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ggkgn" podStartSLOduration=124.7167911 podStartE2EDuration="2m4.7167911s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:05.711532412 +0000 UTC m=+145.169480556" watchObservedRunningTime="2025-10-06 13:06:05.7167911 +0000 UTC m=+145.174739244" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.754352 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ghwjb" event={"ID":"4c8dd3b2-db00-42b3-a439-43360407c2bd","Type":"ContainerStarted","Data":"f7a302e027f6c0766bc9d2375577efcd23f4a666b7e85bf27201bdba6259b331"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.754419 4867 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ingress-canary/ingress-canary-ghwjb" event={"ID":"4c8dd3b2-db00-42b3-a439-43360407c2bd","Type":"ContainerStarted","Data":"c3ce86d8000a305a8e7aa71c42fbfd5267dd35ee8df73c33d6637b1b2ef93f58"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.767084 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:05 crc kubenswrapper[4867]: E1006 13:06:05.769372 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:06.269354489 +0000 UTC m=+145.727302633 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.775354 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fp56z" podStartSLOduration=124.762878789 podStartE2EDuration="2m4.762878789s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:05.758910295 +0000 UTC m=+145.216858439" watchObservedRunningTime="2025-10-06 13:06:05.762878789 +0000 UTC m=+145.220826933" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.787200 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zkcnc" podStartSLOduration=124.787178806 podStartE2EDuration="2m4.787178806s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:05.786504619 +0000 UTC m=+145.244452763" watchObservedRunningTime="2025-10-06 13:06:05.787178806 +0000 UTC m=+145.245126950" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.796792 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" 
event={"ID":"8fd5e314-f3ad-4b3f-8b7c-a9df02a01c70","Type":"ContainerStarted","Data":"9ba7983059cbd0e3314c87cb93fb7d23236aee426bf35afa7a3c17fef7136a53"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.825552 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rqnc4" event={"ID":"a85dd45a-f972-4cb7-aa77-e2f8468df1cf","Type":"ContainerStarted","Data":"1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.839097 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" event={"ID":"5b5edafd-513a-4a3a-bbc8-b027d6afda2c","Type":"ContainerStarted","Data":"5ab1b2b19965366fbad63224b1e696a2c594c1fa0e7df23ae240962f842c316c"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.840226 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" podStartSLOduration=124.840213977 podStartE2EDuration="2m4.840213977s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:05.837235839 +0000 UTC m=+145.295183983" watchObservedRunningTime="2025-10-06 13:06:05.840213977 +0000 UTC m=+145.298162121" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.841224 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:05 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:05 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:05 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.841284 4867 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.865937 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6" event={"ID":"e2b1c8fc-889a-4fd7-b34a-72c9c6575b93","Type":"ContainerStarted","Data":"f5c3f4c04f8f55c7e5e576d932292fed383d8fa7a94f83b4d53d1ca55b171d0e"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.869743 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:05 crc kubenswrapper[4867]: E1006 13:06:05.871345 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:06.371328753 +0000 UTC m=+145.829276897 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.950976 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" event={"ID":"a5d3717e-bfd0-4f99-9b43-f871cc3c1801","Type":"ContainerStarted","Data":"2790490529c266af3743835f5369600310e6d3cba82ab8f3e69cbebe1584d027"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.951633 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ghwjb" podStartSLOduration=6.951614939 podStartE2EDuration="6.951614939s" podCreationTimestamp="2025-10-06 13:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:05.894421549 +0000 UTC m=+145.352369693" watchObservedRunningTime="2025-10-06 13:06:05.951614939 +0000 UTC m=+145.409563083" Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.968956 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" event={"ID":"d2044b33-c26a-4abb-b304-d7fa7eaaec71","Type":"ContainerStarted","Data":"89e297cdc3394284bb8dc21d66ddbbea8ae5df1ca90eac906f080d363df8ff64"} Oct 06 13:06:05 crc kubenswrapper[4867]: I1006 13:06:05.971267 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:05 crc kubenswrapper[4867]: E1006 13:06:05.971524 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:06.471513811 +0000 UTC m=+145.929461955 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.016579 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" event={"ID":"217a87bd-eeb4-49fa-ad25-c8d35ede1637","Type":"ContainerStarted","Data":"37282d7975cd0771c514b9f3a74eecf40085e2833aaab44e36eec16ca89755cd"} Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.018717 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rqnc4" podStartSLOduration=125.018706199 podStartE2EDuration="2m5.018706199s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:05.957490173 +0000 UTC m=+145.415438317" watchObservedRunningTime="2025-10-06 13:06:06.018706199 +0000 UTC m=+145.476654343" Oct 
06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.024021 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" event={"ID":"fe98621b-3abf-47ac-a3f8-0abc2c371dcf","Type":"ContainerStarted","Data":"02d26ce42c7ccac936bf00b5b13e288d0f40b1e687a413172f9ba8fa68ebb675"} Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.030820 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp" event={"ID":"1204892b-a86d-4b14-9aca-1fcbd64c9cd2","Type":"ContainerStarted","Data":"2245320ad426103f69292b768d62315c7c5d4e80b23a3e5c821f640ee7c908fc"} Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.032176 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4k5tp" event={"ID":"b0af59da-ee70-4965-88c1-422936f4156d","Type":"ContainerStarted","Data":"71276a4454e77255b2578a0f71bf365579391c67858380ccd1c8f87702c51e41"} Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.082550 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:06 crc kubenswrapper[4867]: E1006 13:06:06.084002 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:06.583967291 +0000 UTC m=+146.041915435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.085353 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pklsp" event={"ID":"21b2c8cf-2109-4663-bdb0-b106d2a4c548","Type":"ContainerStarted","Data":"87d852656e37673aeb09565117af2de86ebd7c569b59ce8c2c518a35161e7cac"} Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.131046 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8clxg" podStartSLOduration=125.131015935 podStartE2EDuration="2m5.131015935s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:06.032396468 +0000 UTC m=+145.490344612" watchObservedRunningTime="2025-10-06 13:06:06.131015935 +0000 UTC m=+145.588964079" Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.166511 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" event={"ID":"ed7648e1-d992-4263-9117-e50cd88a66a9","Type":"ContainerStarted","Data":"d0c6ce9a09e0a1c79104dc597e6dfb64b876500727fbc0c1278eea8a9528df6c"} Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.184141 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:06 crc kubenswrapper[4867]: E1006 13:06:06.184597 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:06.68457578 +0000 UTC m=+146.142523934 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.243926 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bf8xq" podStartSLOduration=125.243883585 podStartE2EDuration="2m5.243883585s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:06.132391541 +0000 UTC m=+145.590339695" watchObservedRunningTime="2025-10-06 13:06:06.243883585 +0000 UTC m=+145.701831729" Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.245374 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dclxp" podStartSLOduration=125.245365234 podStartE2EDuration="2m5.245365234s" podCreationTimestamp="2025-10-06 
13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:06.235929387 +0000 UTC m=+145.693877551" watchObservedRunningTime="2025-10-06 13:06:06.245365234 +0000 UTC m=+145.703313378" Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.268677 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rff6q" event={"ID":"e4f928da-a8de-4a0b-a256-0002c1b76e81","Type":"ContainerStarted","Data":"aac4904001c84eb333acbe44f0582e986113b38ec21098d3e346e869db2ce057"} Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.285838 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:06 crc kubenswrapper[4867]: E1006 13:06:06.286205 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:06.786167124 +0000 UTC m=+146.244115268 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.287981 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" event={"ID":"b2e9bd61-4b86-4b8f-873e-6a143973f249","Type":"ContainerStarted","Data":"06c8e0cf925e714454ca24242c668de0b9c9012138bb7a1fc31502c4dbf0c16c"} Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.342899 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" event={"ID":"b9f0a917-6145-4144-b761-62695806f129","Type":"ContainerStarted","Data":"f11fd282e2ea3b426c9af5e9d6e8a190eefbef6c5a7042c467ad5fa569976e61"} Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.356181 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.389440 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:06 crc kubenswrapper[4867]: E1006 13:06:06.392882 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:06.892865483 +0000 UTC m=+146.350813627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.416162 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.427268 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkxnb" podStartSLOduration=125.427226604 podStartE2EDuration="2m5.427226604s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:06.368281138 +0000 UTC m=+145.826229282" watchObservedRunningTime="2025-10-06 13:06:06.427226604 +0000 UTC m=+145.885174748" Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.499683 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:06 crc kubenswrapper[4867]: E1006 13:06:06.501197 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:07.001165934 +0000 UTC m=+146.459114078 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.565815 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" podStartSLOduration=125.565783318 podStartE2EDuration="2m5.565783318s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:06.499408738 +0000 UTC m=+145.957356882" watchObservedRunningTime="2025-10-06 13:06:06.565783318 +0000 UTC m=+146.023731462" Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.601938 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:06 crc kubenswrapper[4867]: E1006 13:06:06.602619 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:07.102582354 +0000 UTC m=+146.560530488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.620828 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-87z4s" podStartSLOduration=125.620808862 podStartE2EDuration="2m5.620808862s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:06.565576443 +0000 UTC m=+146.023524587" watchObservedRunningTime="2025-10-06 13:06:06.620808862 +0000 UTC m=+146.078757006" Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.703053 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:06 crc kubenswrapper[4867]: E1006 13:06:06.703525 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:07.203509321 +0000 UTC m=+146.661457465 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.813449 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:06 crc kubenswrapper[4867]: E1006 13:06:06.813836 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:07.313823374 +0000 UTC m=+146.771771518 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.839765 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:06 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:06 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:06 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.840194 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:06 crc kubenswrapper[4867]: I1006 13:06:06.914461 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:06 crc kubenswrapper[4867]: E1006 13:06:06.914823 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 13:06:07.414809373 +0000 UTC m=+146.872757517 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.016340 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.016752 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:07.516734036 +0000 UTC m=+146.974682180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.117900 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.118441 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:07.618409813 +0000 UTC m=+147.076357957 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.220028 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.220694 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:07.720663965 +0000 UTC m=+147.178612109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.321900 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.322321 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:07.822307551 +0000 UTC m=+147.280255695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.357672 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" event={"ID":"217a87bd-eeb4-49fa-ad25-c8d35ede1637","Type":"ContainerStarted","Data":"71c512faab50ed6c1ac797ef9919928e5967e34febb4606d2009c7e4ae6e7090"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.357736 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" event={"ID":"217a87bd-eeb4-49fa-ad25-c8d35ede1637","Type":"ContainerStarted","Data":"70183d188d0ab72e2ef6b0006c06ad7c7d00fde980308bc7fd2f7beaa5679ba5"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.358791 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.362272 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6" event={"ID":"e2b1c8fc-889a-4fd7-b34a-72c9c6575b93","Type":"ContainerStarted","Data":"faf5d42eeb6f664a3a290b04827fdfa0d643a42bf0f0fe4243e890c2fbf70638"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.362325 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6" 
event={"ID":"e2b1c8fc-889a-4fd7-b34a-72c9c6575b93","Type":"ContainerStarted","Data":"21f5c0502edc1077fef9dceb72926c25b9852f460166165c620b59844a970130"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.384235 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" event={"ID":"2cfb6927-e0b7-4f5b-a3d4-feba89ec80c0","Type":"ContainerStarted","Data":"1902eb454c6980ab114e91ffab3bcce97a711e6a241c8545fa2b293fab46032a"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.386165 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" event={"ID":"598298ab-238a-4776-9e23-e66c273dc805","Type":"ContainerStarted","Data":"c72ea94e655d936e618651192f8799ebba998eb0cb6590bad3bc3db720987234"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.391089 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" event={"ID":"1d606431-b258-4baa-bed9-95e82471aa02","Type":"ContainerStarted","Data":"4436ed140fbf2d75dbdcb401092a2d0f7a6585d473f17dd44cab5dae047bfd91"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.391115 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" event={"ID":"1d606431-b258-4baa-bed9-95e82471aa02","Type":"ContainerStarted","Data":"0c80cd2559d653cf428cea289512715decec1a49397f24dae2718b7a711c7bf8"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.393766 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" event={"ID":"aab82a97-184d-4b45-b051-f6fbdb925819","Type":"ContainerStarted","Data":"8a1d5b57cd4f7c956418b521ad98458d75636dc9545bbacdd22c80d4747fc867"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.396662 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" event={"ID":"b9f0a917-6145-4144-b761-62695806f129","Type":"ContainerStarted","Data":"91fea7f66a5cf8e2772bbc1a55095ea5687da08354b3f8c0ae7094d056f794c9"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.397191 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.398915 4867 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-l64rw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.398965 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" podUID="b9f0a917-6145-4144-b761-62695806f129" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.401616 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" event={"ID":"1e611b5d-e10c-4148-a5af-b62505d05e74","Type":"ContainerStarted","Data":"71ffe2d7e544442413391166a76c532ee3b6efbb1fdf758866f3d73a74f56a6e"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.404359 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" podStartSLOduration=126.404349473 podStartE2EDuration="2m6.404349473s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-10-06 13:06:07.402623877 +0000 UTC m=+146.860572021" watchObservedRunningTime="2025-10-06 13:06:07.404349473 +0000 UTC m=+146.862297617" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.406758 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" event={"ID":"470ad2dd-46cd-49e5-ac59-032631bfcb0b","Type":"ContainerStarted","Data":"254aca77ec7f0fd6dee087906fac78b5899da7a98ca6f3ab72fa8fe49559d3b4"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.407357 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.412166 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pklsp" event={"ID":"21b2c8cf-2109-4663-bdb0-b106d2a4c548","Type":"ContainerStarted","Data":"995b7b9be3a0d19532221b9bade52c825f5a3010b0ddcfcd5494ba023845aba1"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.421049 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8lg7n" event={"ID":"ed7648e1-d992-4263-9117-e50cd88a66a9","Type":"ContainerStarted","Data":"ede57db86286cf452021cfc705668621b361bdb4eae9007cc9d7c6c3e536ad05"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.427200 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.427613 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:07.927588672 +0000 UTC m=+147.385536816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.427931 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" event={"ID":"0e1c41fc-3e0d-4048-97d2-54c54bc065e5","Type":"ContainerStarted","Data":"fd204990e6cadcbd504a59fa94e64c39886451d8a0e74d818b22ecbde7380614"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.428995 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.434451 4867 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xj8lg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.434503 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" podUID="0e1c41fc-3e0d-4048-97d2-54c54bc065e5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Oct 06 
13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.439245 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" event={"ID":"48abb7a8-e725-44c3-b95a-25c70999773f","Type":"ContainerStarted","Data":"117f6cb258e68256dff4b8f72d1277806ebedba3e0eaf27291331607ccd54545"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.450905 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kv4q6" podStartSLOduration=126.450878273 podStartE2EDuration="2m6.450878273s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:07.448765348 +0000 UTC m=+146.906713492" watchObservedRunningTime="2025-10-06 13:06:07.450878273 +0000 UTC m=+146.908826417" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.453358 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" event={"ID":"d2044b33-c26a-4abb-b304-d7fa7eaaec71","Type":"ContainerStarted","Data":"bc968c071e5613874c304d3dcccadf29b9ac9f3ef69b2a21b45cc7dcea2b82cd"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.459344 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" event={"ID":"3f1a0563-0bdb-42ae-8754-ef1f299414d8","Type":"ContainerStarted","Data":"126732c6e9bb98e5be92a1e119a88cee82f3c3c4fcbb2e592571424b39733162"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.466759 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" event={"ID":"5b5edafd-513a-4a3a-bbc8-b027d6afda2c","Type":"ContainerStarted","Data":"289b046b9ca70eb7e0f48ac329b99e37b0caa5f3e3cc831f762947ce018d444f"} Oct 06 13:06:07 crc 
kubenswrapper[4867]: I1006 13:06:07.471128 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" event={"ID":"9b0008a7-3dfa-4e42-be35-3493c341fc69","Type":"ContainerStarted","Data":"0b241e34ed6447a0bf15aa3fabb99225e20ce2bf15eaa9f0b0f66460c2108f35"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.471678 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.478450 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4k5tp" event={"ID":"b0af59da-ee70-4965-88c1-422936f4156d","Type":"ContainerStarted","Data":"86e5151f8b78e3897847ef94b1285fa7a9d736b0bbdb35c9f5d6a55619878c53"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.478501 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4k5tp" event={"ID":"b0af59da-ee70-4965-88c1-422936f4156d","Type":"ContainerStarted","Data":"d2f9b5bdc45afee3d914603722ad8c1099f0e5d76d87dfb79881ed7928abfcad"} Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.478531 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.479445 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zszfp" podStartSLOduration=126.479433182 podStartE2EDuration="2m6.479433182s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:07.477385618 +0000 UTC m=+146.935333762" watchObservedRunningTime="2025-10-06 13:06:07.479433182 +0000 UTC m=+146.937381326" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 
13:06:07.500816 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.527666 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zl588" podStartSLOduration=126.527647737 podStartE2EDuration="2m6.527647737s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:07.527076502 +0000 UTC m=+146.985024646" watchObservedRunningTime="2025-10-06 13:06:07.527647737 +0000 UTC m=+146.985595881" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.528465 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.528867 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.028840778 +0000 UTC m=+147.486788912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.529176 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.533949 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.033936642 +0000 UTC m=+147.491884786 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.631703 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.633180 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.633509 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.133470342 +0000 UTC m=+147.591418486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.638099 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.679188 4867 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-nplc7 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.679353 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" podUID="1e611b5d-e10c-4148-a5af-b62505d05e74" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.700481 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-s9l2c" podStartSLOduration=126.700459419 podStartE2EDuration="2m6.700459419s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:07.586618983 +0000 UTC m=+147.044567117" watchObservedRunningTime="2025-10-06 13:06:07.700459419 +0000 UTC m=+147.158407563" Oct 06 13:06:07 crc 
kubenswrapper[4867]: I1006 13:06:07.701421 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" podStartSLOduration=126.701414134 podStartE2EDuration="2m6.701414134s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:07.697279396 +0000 UTC m=+147.155227540" watchObservedRunningTime="2025-10-06 13:06:07.701414134 +0000 UTC m=+147.159362278" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.734220 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.734623 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.234612375 +0000 UTC m=+147.692560519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.772130 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" podStartSLOduration=126.772112389 podStartE2EDuration="2m6.772112389s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:07.771630036 +0000 UTC m=+147.229578180" watchObservedRunningTime="2025-10-06 13:06:07.772112389 +0000 UTC m=+147.230060533" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.835360 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.835545 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.335515272 +0000 UTC m=+147.793463416 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.835669 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.836136 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.336113138 +0000 UTC m=+147.794061282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.849496 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hzns5" podStartSLOduration=126.849476758 podStartE2EDuration="2m6.849476758s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:07.843910192 +0000 UTC m=+147.301858336" watchObservedRunningTime="2025-10-06 13:06:07.849476758 +0000 UTC m=+147.307424902" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.852798 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:07 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:07 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:07 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.852886 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.918379 4867 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" podStartSLOduration=126.918351745 podStartE2EDuration="2m6.918351745s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:07.914382971 +0000 UTC m=+147.372331125" watchObservedRunningTime="2025-10-06 13:06:07.918351745 +0000 UTC m=+147.376299889" Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.937490 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.937725 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.437695442 +0000 UTC m=+147.895643586 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:07 crc kubenswrapper[4867]: I1006 13:06:07.938078 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:07 crc kubenswrapper[4867]: E1006 13:06:07.938547 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.438530194 +0000 UTC m=+147.896478338 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.008200 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" podStartSLOduration=127.008180431 podStartE2EDuration="2m7.008180431s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:07.969929567 +0000 UTC m=+147.427877721" watchObservedRunningTime="2025-10-06 13:06:08.008180431 +0000 UTC m=+147.466128565" Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.008618 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tcmnq" podStartSLOduration=127.008613662 podStartE2EDuration="2m7.008613662s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:08.00853491 +0000 UTC m=+147.466483054" watchObservedRunningTime="2025-10-06 13:06:08.008613662 +0000 UTC m=+147.466561806" Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.039563 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.039715 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.539689507 +0000 UTC m=+147.997637651 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.040048 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.040424 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.540409116 +0000 UTC m=+147.998357260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.065969 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4k5tp" podStartSLOduration=9.065941206 podStartE2EDuration="9.065941206s" podCreationTimestamp="2025-10-06 13:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:08.063684877 +0000 UTC m=+147.521633021" watchObservedRunningTime="2025-10-06 13:06:08.065941206 +0000 UTC m=+147.523889350" Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.141132 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.141388 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.641338803 +0000 UTC m=+148.099286947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.141473 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.141813 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.641797535 +0000 UTC m=+148.099745669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.171496 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pklsp" podStartSLOduration=128.171472924 podStartE2EDuration="2m8.171472924s" podCreationTimestamp="2025-10-06 13:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:08.169326787 +0000 UTC m=+147.627274931" watchObservedRunningTime="2025-10-06 13:06:08.171472924 +0000 UTC m=+147.629421068" Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.243090 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.243363 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.743322238 +0000 UTC m=+148.201270382 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.265546 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8kf44" podStartSLOduration=127.265520221 podStartE2EDuration="2m7.265520221s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:08.260415797 +0000 UTC m=+147.718363941" watchObservedRunningTime="2025-10-06 13:06:08.265520221 +0000 UTC m=+147.723468365" Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.301203 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-2vpqs" podStartSLOduration=127.301182976 podStartE2EDuration="2m7.301182976s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:08.298132666 +0000 UTC m=+147.756080810" watchObservedRunningTime="2025-10-06 13:06:08.301182976 +0000 UTC m=+147.759131120" Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.346093 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.346401 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.846389452 +0000 UTC m=+148.304337596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.447318 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.447533 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.947507114 +0000 UTC m=+148.405455258 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.447627 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.447951 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:08.947942305 +0000 UTC m=+148.405890439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.484012 4867 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xj8lg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.484070 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" podUID="0e1c41fc-3e0d-4048-97d2-54c54bc065e5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.548554 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.548744 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.048713119 +0000 UTC m=+148.506661263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.549036 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.552689 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.052670832 +0000 UTC m=+148.510618966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.650522 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.650703 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.150677503 +0000 UTC m=+148.608625647 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.650872 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.651229 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.151213427 +0000 UTC m=+148.609161571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.752298 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.752547 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.252506604 +0000 UTC m=+148.710454748 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.752904 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.753352 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.253344016 +0000 UTC m=+148.711292160 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.839901 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:08 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:08 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:08 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.840489 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.853587 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.853783 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 13:06:09.353748099 +0000 UTC m=+148.811696243 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.853850 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.854188 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.354172641 +0000 UTC m=+148.812120995 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.955179 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.955391 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.455355495 +0000 UTC m=+148.913303639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:08 crc kubenswrapper[4867]: I1006 13:06:08.955533 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:08 crc kubenswrapper[4867]: E1006 13:06:08.955965 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.45595779 +0000 UTC m=+148.913905934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.056937 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:09 crc kubenswrapper[4867]: E1006 13:06:09.057227 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.557183925 +0000 UTC m=+149.015132069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.057326 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.057426 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.057530 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.057614 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.060463 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.069205 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.073151 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.073746 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 
13:06:09.110522 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cwk8q"] Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.119221 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.123113 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.133000 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cwk8q"] Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.137073 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.144628 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.158522 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.159518 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:09 crc kubenswrapper[4867]: E1006 13:06:09.159809 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.659798847 +0000 UTC m=+149.117746991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.210529 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l64rw" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.260193 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:09 crc 
kubenswrapper[4867]: I1006 13:06:09.260445 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-utilities\") pod \"community-operators-cwk8q\" (UID: \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.260517 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n72zz\" (UniqueName: \"kubernetes.io/projected/7971c0f6-6d96-4e00-9bda-003c54ad5d53-kube-api-access-n72zz\") pod \"community-operators-cwk8q\" (UID: \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.260538 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-catalog-content\") pod \"community-operators-cwk8q\" (UID: \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:09 crc kubenswrapper[4867]: E1006 13:06:09.260670 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.760650782 +0000 UTC m=+149.218598926 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.299417 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zzpjw"] Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.300294 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.309775 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.338188 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzpjw"] Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.362544 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.362618 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n72zz\" (UniqueName: \"kubernetes.io/projected/7971c0f6-6d96-4e00-9bda-003c54ad5d53-kube-api-access-n72zz\") pod \"community-operators-cwk8q\" (UID: \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " 
pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.362638 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-catalog-content\") pod \"community-operators-cwk8q\" (UID: \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.362738 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn296\" (UniqueName: \"kubernetes.io/projected/fa3ccdb7-011e-4f41-9599-dab85489545e-kube-api-access-wn296\") pod \"certified-operators-zzpjw\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.362763 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-utilities\") pod \"certified-operators-zzpjw\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.362790 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-catalog-content\") pod \"certified-operators-zzpjw\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.362848 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-utilities\") pod \"community-operators-cwk8q\" (UID: 
\"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.363461 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-utilities\") pod \"community-operators-cwk8q\" (UID: \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:09 crc kubenswrapper[4867]: E1006 13:06:09.363955 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.863941471 +0000 UTC m=+149.321889615 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.365150 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-catalog-content\") pod \"community-operators-cwk8q\" (UID: \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.405283 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n72zz\" (UniqueName: \"kubernetes.io/projected/7971c0f6-6d96-4e00-9bda-003c54ad5d53-kube-api-access-n72zz\") pod \"community-operators-cwk8q\" (UID: 
\"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.448559 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.464387 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.464658 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn296\" (UniqueName: \"kubernetes.io/projected/fa3ccdb7-011e-4f41-9599-dab85489545e-kube-api-access-wn296\") pod \"certified-operators-zzpjw\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.464687 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-utilities\") pod \"certified-operators-zzpjw\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.464709 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-catalog-content\") pod \"certified-operators-zzpjw\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.465171 4867 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-catalog-content\") pod \"certified-operators-zzpjw\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:09 crc kubenswrapper[4867]: E1006 13:06:09.465244 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:09.965229378 +0000 UTC m=+149.423177522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.465861 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-utilities\") pod \"certified-operators-zzpjw\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.498736 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" event={"ID":"5b5edafd-513a-4a3a-bbc8-b027d6afda2c","Type":"ContainerStarted","Data":"776d3deb7e0d4b4a8cf7c3628ad3ee4d62eb3046b9bcc5b473fcb75723aaf2b8"} Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.498775 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" 
event={"ID":"5b5edafd-513a-4a3a-bbc8-b027d6afda2c","Type":"ContainerStarted","Data":"78bdcb2bf834edf8b38f3496a0d75270187a55cf41a5a85dc3168dcbfdb7eb07"} Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.512710 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn296\" (UniqueName: \"kubernetes.io/projected/fa3ccdb7-011e-4f41-9599-dab85489545e-kube-api-access-wn296\") pod \"certified-operators-zzpjw\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.513556 4867 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xj8lg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.513593 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" podUID="0e1c41fc-3e0d-4048-97d2-54c54bc065e5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.520755 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x8xqc"] Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.521904 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.554413 4867 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.567968 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-75xtb" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.568685 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-catalog-content\") pod \"community-operators-x8xqc\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.568814 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.568862 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-utilities\") pod \"community-operators-x8xqc\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.568952 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lq75\" 
(UniqueName: \"kubernetes.io/projected/b253d2fc-0c97-4110-8e96-f127181ff1ff-kube-api-access-9lq75\") pod \"community-operators-x8xqc\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:09 crc kubenswrapper[4867]: E1006 13:06:09.570795 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:10.070776067 +0000 UTC m=+149.528724211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.621879 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x8xqc"] Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.659867 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.670881 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.671078 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-catalog-content\") pod \"community-operators-x8xqc\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.671142 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-utilities\") pod \"community-operators-x8xqc\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.671187 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lq75\" (UniqueName: \"kubernetes.io/projected/b253d2fc-0c97-4110-8e96-f127181ff1ff-kube-api-access-9lq75\") pod \"community-operators-x8xqc\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:09 crc kubenswrapper[4867]: E1006 13:06:09.671685 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 13:06:10.171665703 +0000 UTC m=+149.629613847 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.672162 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-catalog-content\") pod \"community-operators-x8xqc\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.672486 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-utilities\") pod \"community-operators-x8xqc\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.709277 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lq75\" (UniqueName: \"kubernetes.io/projected/b253d2fc-0c97-4110-8e96-f127181ff1ff-kube-api-access-9lq75\") pod \"community-operators-x8xqc\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.718826 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hx8v4"] Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.722605 4867 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.744327 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hx8v4"] Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.773280 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p69pz\" (UniqueName: \"kubernetes.io/projected/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-kube-api-access-p69pz\") pod \"certified-operators-hx8v4\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.773414 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-catalog-content\") pod \"certified-operators-hx8v4\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.773455 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-utilities\") pod \"certified-operators-hx8v4\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.773514 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:09 crc 
kubenswrapper[4867]: E1006 13:06:09.773946 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:10.273932675 +0000 UTC m=+149.731880819 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.840983 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:09 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:09 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:09 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.841040 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:09 crc kubenswrapper[4867]: W1006 13:06:09.858398 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-fb13b750c06961ebcc02a5f6569922f6ac4bd5e93025ddf4a7e77b796faadafc WatchSource:0}: Error finding container 
fb13b750c06961ebcc02a5f6569922f6ac4bd5e93025ddf4a7e77b796faadafc: Status 404 returned error can't find the container with id fb13b750c06961ebcc02a5f6569922f6ac4bd5e93025ddf4a7e77b796faadafc Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.874531 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:09 crc kubenswrapper[4867]: E1006 13:06:09.874718 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:10.374692488 +0000 UTC m=+149.832640632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.874865 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p69pz\" (UniqueName: \"kubernetes.io/projected/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-kube-api-access-p69pz\") pod \"certified-operators-hx8v4\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.874940 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-catalog-content\") pod \"certified-operators-hx8v4\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.874994 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-utilities\") pod \"certified-operators-hx8v4\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.875014 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:09 crc kubenswrapper[4867]: E1006 13:06:09.875376 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:10.375364436 +0000 UTC m=+149.833312580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.876213 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-catalog-content\") pod \"certified-operators-hx8v4\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.876565 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-utilities\") pod \"certified-operators-hx8v4\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.902976 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.940478 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p69pz\" (UniqueName: \"kubernetes.io/projected/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-kube-api-access-p69pz\") pod \"certified-operators-hx8v4\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.976466 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:09 crc kubenswrapper[4867]: E1006 13:06:09.978239 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:10.478191923 +0000 UTC m=+149.936140067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:09 crc kubenswrapper[4867]: I1006 13:06:09.999566 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:10 crc kubenswrapper[4867]: E1006 13:06:10.000366 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:10.500346954 +0000 UTC m=+149.958295108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.101013 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:10 crc kubenswrapper[4867]: E1006 13:06:10.101307 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:10.601289322 +0000 UTC m=+150.059237466 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.105433 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.206231 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:10 crc kubenswrapper[4867]: E1006 13:06:10.208489 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 13:06:10.708469893 +0000 UTC m=+150.166418027 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6h9ff" (UID: "f59db107-9767-4161-83f0-09f15ba1d881") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.212104 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cwk8q"] Oct 06 13:06:10 crc kubenswrapper[4867]: W1006 13:06:10.235201 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7971c0f6_6d96_4e00_9bda_003c54ad5d53.slice/crio-a51e9ce9297aea581ed6f1c66fd8989d1b039b0c503d8bd877419d771321a34b WatchSource:0}: Error finding container a51e9ce9297aea581ed6f1c66fd8989d1b039b0c503d8bd877419d771321a34b: Status 404 returned error can't find the container with id 
a51e9ce9297aea581ed6f1c66fd8989d1b039b0c503d8bd877419d771321a34b Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.310351 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:10 crc kubenswrapper[4867]: E1006 13:06:10.310750 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 13:06:10.810732215 +0000 UTC m=+150.268680359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.347139 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x8xqc"] Oct 06 13:06:10 crc kubenswrapper[4867]: W1006 13:06:10.357990 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb253d2fc_0c97_4110_8e96_f127181ff1ff.slice/crio-c0a24bbdc8dcb4e48b37476a97ed1f32e1ba23bcb86a0fb24ccb78a5fec5c052 WatchSource:0}: Error finding container c0a24bbdc8dcb4e48b37476a97ed1f32e1ba23bcb86a0fb24ccb78a5fec5c052: Status 404 returned error can't find the container with id 
c0a24bbdc8dcb4e48b37476a97ed1f32e1ba23bcb86a0fb24ccb78a5fec5c052 Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.402438 4867 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-06T13:06:09.554435828Z","Handler":null,"Name":""} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.410118 4867 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.410165 4867 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.411602 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.414098 4867 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.414141 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.438372 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6h9ff\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.474233 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzpjw"] Oct 06 13:06:10 crc kubenswrapper[4867]: W1006 13:06:10.477551 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa3ccdb7_011e_4f41_9599_dab85489545e.slice/crio-558117691be34bc3e81d4883b9c48e77ecbf0acc8ef608cce02cb58d1ab333d2 WatchSource:0}: Error finding container 558117691be34bc3e81d4883b9c48e77ecbf0acc8ef608cce02cb58d1ab333d2: Status 404 returned error can't find the container with id 558117691be34bc3e81d4883b9c48e77ecbf0acc8ef608cce02cb58d1ab333d2 Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.506886 4867 generic.go:334] "Generic (PLEG): container finished" podID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" 
containerID="12e97de2de7b500d756a2dd9d05ef2d8e05b15cbc83730fd2c10dd4c2d0fdfb0" exitCode=0 Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.507018 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwk8q" event={"ID":"7971c0f6-6d96-4e00-9bda-003c54ad5d53","Type":"ContainerDied","Data":"12e97de2de7b500d756a2dd9d05ef2d8e05b15cbc83730fd2c10dd4c2d0fdfb0"} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.507053 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwk8q" event={"ID":"7971c0f6-6d96-4e00-9bda-003c54ad5d53","Type":"ContainerStarted","Data":"a51e9ce9297aea581ed6f1c66fd8989d1b039b0c503d8bd877419d771321a34b"} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.508764 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.508868 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5ee41bb26bf98547ee4b47e01a45a42248b40f393d9e7aa921a8e3eec8f676d3"} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.508900 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fb13b750c06961ebcc02a5f6569922f6ac4bd5e93025ddf4a7e77b796faadafc"} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.512199 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 13:06:10 
crc kubenswrapper[4867]: I1006 13:06:10.512201 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ceec0a876b7de5c431d46f334d8a05ee38103c42b5bea0234abbcaab18f5bed0"} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.512553 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7a9fcfe8dcab9924f2a81912c0e1c77447a39dfca6652fd8157f05ee17966928"} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.514691 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzpjw" event={"ID":"fa3ccdb7-011e-4f41-9599-dab85489545e","Type":"ContainerStarted","Data":"558117691be34bc3e81d4883b9c48e77ecbf0acc8ef608cce02cb58d1ab333d2"} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.516570 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8xqc" event={"ID":"b253d2fc-0c97-4110-8e96-f127181ff1ff","Type":"ContainerStarted","Data":"c0a24bbdc8dcb4e48b37476a97ed1f32e1ba23bcb86a0fb24ccb78a5fec5c052"} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.519212 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" event={"ID":"5b5edafd-513a-4a3a-bbc8-b027d6afda2c","Type":"ContainerStarted","Data":"1a3255c7b947327b628d8e1eaf12bf3ced96057086d40d3cfa12c6aab4f9f711"} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.520756 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.523790 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5e6993951a6d9a7a4e90e51127bd79dea0b5bb893c01dac10196ad4a3ee9ef57"} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.523820 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b51e9caa593506fc320df751fa42da175c287b5c98c5526ee357c41fb6a4745e"} Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.524125 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.551771 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-zpvzr" podStartSLOduration=11.551744696 podStartE2EDuration="11.551744696s" podCreationTimestamp="2025-10-06 13:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:10.5496097 +0000 UTC m=+150.007557864" watchObservedRunningTime="2025-10-06 13:06:10.551744696 +0000 UTC m=+150.009692840" Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.584474 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.617837 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hx8v4"] Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.812594 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h9ff"] Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.838299 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:10 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:10 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:10 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:10 crc kubenswrapper[4867]: I1006 13:06:10.838359 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.030850 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.031698 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.034119 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.034146 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.038899 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.120482 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/376a730a-070b-44d8-8654-555b9c5925c0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"376a730a-070b-44d8-8654-555b9c5925c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.120702 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/376a730a-070b-44d8-8654-555b9c5925c0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"376a730a-070b-44d8-8654-555b9c5925c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.222160 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/376a730a-070b-44d8-8654-555b9c5925c0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"376a730a-070b-44d8-8654-555b9c5925c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.222226 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/376a730a-070b-44d8-8654-555b9c5925c0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"376a730a-070b-44d8-8654-555b9c5925c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.222353 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/376a730a-070b-44d8-8654-555b9c5925c0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"376a730a-070b-44d8-8654-555b9c5925c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.227519 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.252184 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/376a730a-070b-44d8-8654-555b9c5925c0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"376a730a-070b-44d8-8654-555b9c5925c0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.278842 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wv65z"] Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.280520 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.286993 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.303217 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv65z"] Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.357897 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.424615 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq9fb\" (UniqueName: \"kubernetes.io/projected/74e08994-05ef-40f2-903f-bb6450059e88-kube-api-access-nq9fb\") pod \"redhat-marketplace-wv65z\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.424671 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-utilities\") pod \"redhat-marketplace-wv65z\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.424734 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-catalog-content\") pod \"redhat-marketplace-wv65z\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.527074 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-catalog-content\") pod \"redhat-marketplace-wv65z\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.527478 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq9fb\" (UniqueName: \"kubernetes.io/projected/74e08994-05ef-40f2-903f-bb6450059e88-kube-api-access-nq9fb\") pod \"redhat-marketplace-wv65z\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.527498 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-utilities\") pod \"redhat-marketplace-wv65z\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.528039 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-catalog-content\") pod \"redhat-marketplace-wv65z\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.528278 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-utilities\") pod \"redhat-marketplace-wv65z\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.546524 4867 generic.go:334] "Generic (PLEG): container finished" podID="b253d2fc-0c97-4110-8e96-f127181ff1ff" 
containerID="1369df4560fdc0bc66ff67ef9ce01c7cf6b4ac499a17f45a85fff28311ac17d9" exitCode=0 Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.546611 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8xqc" event={"ID":"b253d2fc-0c97-4110-8e96-f127181ff1ff","Type":"ContainerDied","Data":"1369df4560fdc0bc66ff67ef9ce01c7cf6b4ac499a17f45a85fff28311ac17d9"} Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.559604 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq9fb\" (UniqueName: \"kubernetes.io/projected/74e08994-05ef-40f2-903f-bb6450059e88-kube-api-access-nq9fb\") pod \"redhat-marketplace-wv65z\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.561879 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" event={"ID":"f59db107-9767-4161-83f0-09f15ba1d881","Type":"ContainerStarted","Data":"381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17"} Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.561913 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" event={"ID":"f59db107-9767-4161-83f0-09f15ba1d881","Type":"ContainerStarted","Data":"3f315cf3fa3fd694b6531a4cd862624df2949c49733127b4632af12d4e577d9c"} Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.562464 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.569297 4867 generic.go:334] "Generic (PLEG): container finished" podID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" containerID="00f4c2cd9cb4237305267ff488e2f690bdfe8b71106491db12e5ad17307dfa08" exitCode=0 Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.569503 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8v4" event={"ID":"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d","Type":"ContainerDied","Data":"00f4c2cd9cb4237305267ff488e2f690bdfe8b71106491db12e5ad17307dfa08"} Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.569559 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8v4" event={"ID":"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d","Type":"ContainerStarted","Data":"1c5ba34e65b009ff7566376c24cd1c2c8b5dd23ad07a64bda7ffc868884eee22"} Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.572805 4867 generic.go:334] "Generic (PLEG): container finished" podID="fa3ccdb7-011e-4f41-9599-dab85489545e" containerID="8c2244dbf2ffee738e3ae4d3d432d3b9b26ad407dc0fcba2b29316bf9f351f07" exitCode=0 Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.572964 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzpjw" event={"ID":"fa3ccdb7-011e-4f41-9599-dab85489545e","Type":"ContainerDied","Data":"8c2244dbf2ffee738e3ae4d3d432d3b9b26ad407dc0fcba2b29316bf9f351f07"} Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.583008 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" podStartSLOduration=130.582990715 podStartE2EDuration="2m10.582990715s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:11.582791169 +0000 UTC m=+151.040739313" watchObservedRunningTime="2025-10-06 13:06:11.582990715 +0000 UTC m=+151.040938859" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.599605 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.678818 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6djpm"] Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.680499 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.686495 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6djpm"] Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.791120 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.815017 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv65z"] Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.820383 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-x4d8v container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.820428 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-x4d8v" podUID="8826a928-e7d1-4cb1-bd08-69849ee5a12b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.820466 4867 patch_prober.go:28] interesting pod/downloads-7954f5f757-x4d8v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 06 13:06:11 
crc kubenswrapper[4867]: I1006 13:06:11.820538 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-x4d8v" podUID="8826a928-e7d1-4cb1-bd08-69849ee5a12b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.841084 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfbw5\" (UniqueName: \"kubernetes.io/projected/c8449de1-c234-45f5-a6dc-e1e300785924-kube-api-access-mfbw5\") pod \"redhat-marketplace-6djpm\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.841210 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-utilities\") pod \"redhat-marketplace-6djpm\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.841236 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-catalog-content\") pod \"redhat-marketplace-6djpm\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.841842 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:11 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 
13:06:11 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:11 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.841868 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.933508 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.933861 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.941346 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.942274 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-utilities\") pod \"redhat-marketplace-6djpm\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.942346 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-catalog-content\") pod \"redhat-marketplace-6djpm\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.942427 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfbw5\" (UniqueName: 
\"kubernetes.io/projected/c8449de1-c234-45f5-a6dc-e1e300785924-kube-api-access-mfbw5\") pod \"redhat-marketplace-6djpm\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.943410 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-utilities\") pod \"redhat-marketplace-6djpm\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.943550 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-catalog-content\") pod \"redhat-marketplace-6djpm\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:06:11 crc kubenswrapper[4867]: I1006 13:06:11.979601 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfbw5\" (UniqueName: \"kubernetes.io/projected/c8449de1-c234-45f5-a6dc-e1e300785924-kube-api-access-mfbw5\") pod \"redhat-marketplace-6djpm\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.010654 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.265796 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6djpm"] Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.285564 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-65gkn"] Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.286930 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.290441 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.297136 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-65gkn"] Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.454584 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k994s\" (UniqueName: \"kubernetes.io/projected/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-kube-api-access-k994s\") pod \"redhat-operators-65gkn\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.454673 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-utilities\") pod \"redhat-operators-65gkn\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.454698 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-catalog-content\") pod \"redhat-operators-65gkn\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.538847 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.538911 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.541806 4867 patch_prober.go:28] interesting pod/console-f9d7485db-rqnc4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.541882 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rqnc4" podUID="a85dd45a-f972-4cb7-aa77-e2f8468df1cf" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.556225 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k994s\" (UniqueName: \"kubernetes.io/projected/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-kube-api-access-k994s\") pod \"redhat-operators-65gkn\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.556318 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-utilities\") pod \"redhat-operators-65gkn\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " 
pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.556350 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-catalog-content\") pod \"redhat-operators-65gkn\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.557187 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-catalog-content\") pod \"redhat-operators-65gkn\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.557689 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-utilities\") pod \"redhat-operators-65gkn\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.577851 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k994s\" (UniqueName: \"kubernetes.io/projected/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-kube-api-access-k994s\") pod \"redhat-operators-65gkn\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.582269 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.582916 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.586768 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.586964 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.597568 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.598072 4867 generic.go:334] "Generic (PLEG): container finished" podID="c8449de1-c234-45f5-a6dc-e1e300785924" containerID="f71809e7d75a9e7dbc7185bf146befffdd8601c3cbb9a7831d17a2bd7faa054b" exitCode=0 Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.598143 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6djpm" event={"ID":"c8449de1-c234-45f5-a6dc-e1e300785924","Type":"ContainerDied","Data":"f71809e7d75a9e7dbc7185bf146befffdd8601c3cbb9a7831d17a2bd7faa054b"} Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.598165 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6djpm" event={"ID":"c8449de1-c234-45f5-a6dc-e1e300785924","Type":"ContainerStarted","Data":"b6df1b9e00f091f33a7789243500b1e472e6eab42dfe98dcb2e8f5a3f26344ec"} Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.615009 4867 generic.go:334] "Generic (PLEG): container finished" podID="74e08994-05ef-40f2-903f-bb6450059e88" containerID="493d3d01e1c0cba5cf3400fa9847f94517df07b68ed8f80ea9eb328ef056d47c" exitCode=0 Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.615129 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv65z" 
event={"ID":"74e08994-05ef-40f2-903f-bb6450059e88","Type":"ContainerDied","Data":"493d3d01e1c0cba5cf3400fa9847f94517df07b68ed8f80ea9eb328ef056d47c"} Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.615157 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv65z" event={"ID":"74e08994-05ef-40f2-903f-bb6450059e88","Type":"ContainerStarted","Data":"f1d92ee0377ee4b3431a4880c5f3d13499881bb55ac109b1542deefe69b70cf8"} Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.622278 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.630635 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"376a730a-070b-44d8-8654-555b9c5925c0","Type":"ContainerStarted","Data":"cb6c4b966a43ba562faad6d46246711da0e198e58365385377db45c5f390ad11"} Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.630695 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"376a730a-070b-44d8-8654-555b9c5925c0","Type":"ContainerStarted","Data":"279bcb1febbf4d59fb04b849660a7fa16b83a425fa21754ed55dfeb21e175831"} Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.638617 4867 generic.go:334] "Generic (PLEG): container finished" podID="598298ab-238a-4776-9e23-e66c273dc805" containerID="c72ea94e655d936e618651192f8799ebba998eb0cb6590bad3bc3db720987234" exitCode=0 Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.639718 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" event={"ID":"598298ab-238a-4776-9e23-e66c273dc805","Type":"ContainerDied","Data":"c72ea94e655d936e618651192f8799ebba998eb0cb6590bad3bc3db720987234"} Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.647776 4867 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.652845 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pklsp" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.657877 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"91dee131-1a24-40d0-9e1d-b9e4e3e2906e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.657917 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"91dee131-1a24-40d0-9e1d-b9e4e3e2906e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.664753 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nplc7" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.684333 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d98tk"] Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.685358 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.712503 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d98tk"] Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.731382 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.731357516 podStartE2EDuration="1.731357516s" podCreationTimestamp="2025-10-06 13:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:12.726535099 +0000 UTC m=+152.184483243" watchObservedRunningTime="2025-10-06 13:06:12.731357516 +0000 UTC m=+152.189305670" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.771563 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g225c\" (UniqueName: \"kubernetes.io/projected/ae5cd503-d6b1-4056-a0b8-b302a68d404b-kube-api-access-g225c\") pod \"redhat-operators-d98tk\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") " pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.771681 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-catalog-content\") pod \"redhat-operators-d98tk\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") " pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.771701 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-utilities\") pod \"redhat-operators-d98tk\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") 
" pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.771801 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"91dee131-1a24-40d0-9e1d-b9e4e3e2906e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.771821 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"91dee131-1a24-40d0-9e1d-b9e4e3e2906e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.773235 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"91dee131-1a24-40d0-9e1d-b9e4e3e2906e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.835725 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.850056 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:12 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:12 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:12 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.850144 
4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.853484 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"91dee131-1a24-40d0-9e1d-b9e4e3e2906e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.873961 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g225c\" (UniqueName: \"kubernetes.io/projected/ae5cd503-d6b1-4056-a0b8-b302a68d404b-kube-api-access-g225c\") pod \"redhat-operators-d98tk\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") " pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.874065 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-catalog-content\") pod \"redhat-operators-d98tk\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") " pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.874086 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-utilities\") pod \"redhat-operators-d98tk\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") " pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.874619 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-utilities\") pod \"redhat-operators-d98tk\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") " pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.875234 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-catalog-content\") pod \"redhat-operators-d98tk\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") " pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.875466 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.875517 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.907402 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g225c\" (UniqueName: \"kubernetes.io/projected/ae5cd503-d6b1-4056-a0b8-b302a68d404b-kube-api-access-g225c\") pod \"redhat-operators-d98tk\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") " pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:12 crc kubenswrapper[4867]: I1006 13:06:12.910482 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.017274 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.274595 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.331263 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-65gkn"] Oct 06 13:06:13 crc kubenswrapper[4867]: W1006 13:06:13.385898 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d2f5af5_a716_4878_ac8b_81c7636ffd7e.slice/crio-abb6e6f8c24a73597c6bb46f2350cb0c82ec68e2f9d62375c5af34d4f3e94f00 WatchSource:0}: Error finding container abb6e6f8c24a73597c6bb46f2350cb0c82ec68e2f9d62375c5af34d4f3e94f00: Status 404 returned error can't find the container with id abb6e6f8c24a73597c6bb46f2350cb0c82ec68e2f9d62375c5af34d4f3e94f00 Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.462823 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d98tk"] Oct 06 13:06:13 crc kubenswrapper[4867]: W1006 13:06:13.485948 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae5cd503_d6b1_4056_a0b8_b302a68d404b.slice/crio-ef209bc8d9bbb429f7c6dd113bbe367f7f445b5f7860aafc741a7c6325a89445 WatchSource:0}: Error finding container ef209bc8d9bbb429f7c6dd113bbe367f7f445b5f7860aafc741a7c6325a89445: Status 404 returned error can't find the container with id ef209bc8d9bbb429f7c6dd113bbe367f7f445b5f7860aafc741a7c6325a89445 Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.614339 4867 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 13:06:13 crc kubenswrapper[4867]: W1006 13:06:13.631825 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod91dee131_1a24_40d0_9e1d_b9e4e3e2906e.slice/crio-5b82bff8e1adacc14ebfaee46c21e8704e0be27de7564944d4818e51d0de3b77 WatchSource:0}: Error finding container 5b82bff8e1adacc14ebfaee46c21e8704e0be27de7564944d4818e51d0de3b77: Status 404 returned error can't find the container with id 5b82bff8e1adacc14ebfaee46c21e8704e0be27de7564944d4818e51d0de3b77 Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.652798 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65gkn" event={"ID":"4d2f5af5-a716-4878-ac8b-81c7636ffd7e","Type":"ContainerStarted","Data":"abb6e6f8c24a73597c6bb46f2350cb0c82ec68e2f9d62375c5af34d4f3e94f00"} Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.655934 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"91dee131-1a24-40d0-9e1d-b9e4e3e2906e","Type":"ContainerStarted","Data":"5b82bff8e1adacc14ebfaee46c21e8704e0be27de7564944d4818e51d0de3b77"} Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.659071 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d98tk" event={"ID":"ae5cd503-d6b1-4056-a0b8-b302a68d404b","Type":"ContainerStarted","Data":"ef209bc8d9bbb429f7c6dd113bbe367f7f445b5f7860aafc741a7c6325a89445"} Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.668779 4867 generic.go:334] "Generic (PLEG): container finished" podID="376a730a-070b-44d8-8654-555b9c5925c0" containerID="cb6c4b966a43ba562faad6d46246711da0e198e58365385377db45c5f390ad11" exitCode=0 Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.668957 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"376a730a-070b-44d8-8654-555b9c5925c0","Type":"ContainerDied","Data":"cb6c4b966a43ba562faad6d46246711da0e198e58365385377db45c5f390ad11"} Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.838368 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:13 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:13 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:13 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.838429 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:13 crc kubenswrapper[4867]: I1006 13:06:13.957973 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.009403 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/598298ab-238a-4776-9e23-e66c273dc805-config-volume\") pod \"598298ab-238a-4776-9e23-e66c273dc805\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.009456 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2n7s\" (UniqueName: \"kubernetes.io/projected/598298ab-238a-4776-9e23-e66c273dc805-kube-api-access-v2n7s\") pod \"598298ab-238a-4776-9e23-e66c273dc805\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.009482 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/598298ab-238a-4776-9e23-e66c273dc805-secret-volume\") pod \"598298ab-238a-4776-9e23-e66c273dc805\" (UID: \"598298ab-238a-4776-9e23-e66c273dc805\") " Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.010188 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/598298ab-238a-4776-9e23-e66c273dc805-config-volume" (OuterVolumeSpecName: "config-volume") pod "598298ab-238a-4776-9e23-e66c273dc805" (UID: "598298ab-238a-4776-9e23-e66c273dc805"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.012054 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/598298ab-238a-4776-9e23-e66c273dc805-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.019340 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598298ab-238a-4776-9e23-e66c273dc805-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "598298ab-238a-4776-9e23-e66c273dc805" (UID: "598298ab-238a-4776-9e23-e66c273dc805"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.019714 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598298ab-238a-4776-9e23-e66c273dc805-kube-api-access-v2n7s" (OuterVolumeSpecName: "kube-api-access-v2n7s") pod "598298ab-238a-4776-9e23-e66c273dc805" (UID: "598298ab-238a-4776-9e23-e66c273dc805"). InnerVolumeSpecName "kube-api-access-v2n7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.113767 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2n7s\" (UniqueName: \"kubernetes.io/projected/598298ab-238a-4776-9e23-e66c273dc805-kube-api-access-v2n7s\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.113815 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/598298ab-238a-4776-9e23-e66c273dc805-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.677321 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"91dee131-1a24-40d0-9e1d-b9e4e3e2906e","Type":"ContainerStarted","Data":"a50f99cd2caf4024e7898b7d3c25420b303f6292ee7a4c8c8010e415d1eac229"} Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.685104 4867 generic.go:334] "Generic (PLEG): container finished" podID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" containerID="a48d0133b5ea33215232539d0bf1785e93ca16402382f78b99221d15801fb567" exitCode=0 Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.685198 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d98tk" event={"ID":"ae5cd503-d6b1-4056-a0b8-b302a68d404b","Type":"ContainerDied","Data":"a48d0133b5ea33215232539d0bf1785e93ca16402382f78b99221d15801fb567"} Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.697050 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.697040 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc" event={"ID":"598298ab-238a-4776-9e23-e66c273dc805","Type":"ContainerDied","Data":"f839f62f8796c1696c1090e47f4b5596529cf9e255a65862017e07880a0fd188"} Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.697270 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f839f62f8796c1696c1090e47f4b5596529cf9e255a65862017e07880a0fd188" Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.699121 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.699093617 podStartE2EDuration="2.699093617s" podCreationTimestamp="2025-10-06 13:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:14.698509432 +0000 UTC m=+154.156457576" watchObservedRunningTime="2025-10-06 13:06:14.699093617 +0000 UTC m=+154.157041761" Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.705387 4867 generic.go:334] "Generic (PLEG): container finished" podID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" containerID="b3fc6dbadd3586c49ab59c12d855fbdaa1f28f8dee0dbc4a461064f9779bb955" exitCode=0 Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.705473 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65gkn" event={"ID":"4d2f5af5-a716-4878-ac8b-81c7636ffd7e","Type":"ContainerDied","Data":"b3fc6dbadd3586c49ab59c12d855fbdaa1f28f8dee0dbc4a461064f9779bb955"} Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.844378 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:14 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:14 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:14 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.845226 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:14 crc kubenswrapper[4867]: I1006 13:06:14.951236 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.034980 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/376a730a-070b-44d8-8654-555b9c5925c0-kubelet-dir\") pod \"376a730a-070b-44d8-8654-555b9c5925c0\" (UID: \"376a730a-070b-44d8-8654-555b9c5925c0\") " Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.035208 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/376a730a-070b-44d8-8654-555b9c5925c0-kube-api-access\") pod \"376a730a-070b-44d8-8654-555b9c5925c0\" (UID: \"376a730a-070b-44d8-8654-555b9c5925c0\") " Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.035377 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/376a730a-070b-44d8-8654-555b9c5925c0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "376a730a-070b-44d8-8654-555b9c5925c0" (UID: "376a730a-070b-44d8-8654-555b9c5925c0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.037565 4867 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/376a730a-070b-44d8-8654-555b9c5925c0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.059607 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/376a730a-070b-44d8-8654-555b9c5925c0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "376a730a-070b-44d8-8654-555b9c5925c0" (UID: "376a730a-070b-44d8-8654-555b9c5925c0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.138979 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/376a730a-070b-44d8-8654-555b9c5925c0-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.737103 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"376a730a-070b-44d8-8654-555b9c5925c0","Type":"ContainerDied","Data":"279bcb1febbf4d59fb04b849660a7fa16b83a425fa21754ed55dfeb21e175831"} Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.737166 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="279bcb1febbf4d59fb04b849660a7fa16b83a425fa21754ed55dfeb21e175831" Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.737222 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.746673 4867 generic.go:334] "Generic (PLEG): container finished" podID="91dee131-1a24-40d0-9e1d-b9e4e3e2906e" containerID="a50f99cd2caf4024e7898b7d3c25420b303f6292ee7a4c8c8010e415d1eac229" exitCode=0 Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.746711 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"91dee131-1a24-40d0-9e1d-b9e4e3e2906e","Type":"ContainerDied","Data":"a50f99cd2caf4024e7898b7d3c25420b303f6292ee7a4c8c8010e415d1eac229"} Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.853147 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:15 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:15 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:15 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:15 crc kubenswrapper[4867]: I1006 13:06:15.853203 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:16 crc kubenswrapper[4867]: I1006 13:06:16.767797 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:06:16 crc kubenswrapper[4867]: I1006 13:06:16.838601 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 
13:06:16 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:16 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:16 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:16 crc kubenswrapper[4867]: I1006 13:06:16.838694 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:17 crc kubenswrapper[4867]: I1006 13:06:17.840798 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:17 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:17 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:17 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:17 crc kubenswrapper[4867]: I1006 13:06:17.841306 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:17 crc kubenswrapper[4867]: I1006 13:06:17.982341 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4k5tp" Oct 06 13:06:18 crc kubenswrapper[4867]: I1006 13:06:18.838761 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:18 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:18 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:18 crc 
kubenswrapper[4867]: healthz check failed Oct 06 13:06:18 crc kubenswrapper[4867]: I1006 13:06:18.838855 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:19 crc kubenswrapper[4867]: I1006 13:06:19.837865 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:19 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:19 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:19 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:19 crc kubenswrapper[4867]: I1006 13:06:19.837943 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:20 crc kubenswrapper[4867]: I1006 13:06:20.837654 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:20 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:20 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:20 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:20 crc kubenswrapper[4867]: I1006 13:06:20.837973 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.665699 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.754371 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kubelet-dir\") pod \"91dee131-1a24-40d0-9e1d-b9e4e3e2906e\" (UID: \"91dee131-1a24-40d0-9e1d-b9e4e3e2906e\") " Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.754496 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "91dee131-1a24-40d0-9e1d-b9e4e3e2906e" (UID: "91dee131-1a24-40d0-9e1d-b9e4e3e2906e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.754629 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kube-api-access\") pod \"91dee131-1a24-40d0-9e1d-b9e4e3e2906e\" (UID: \"91dee131-1a24-40d0-9e1d-b9e4e3e2906e\") " Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.754873 4867 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.773947 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "91dee131-1a24-40d0-9e1d-b9e4e3e2906e" (UID: "91dee131-1a24-40d0-9e1d-b9e4e3e2906e"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.825603 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-x4d8v" Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.837318 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:21 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:21 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:21 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.838786 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.856828 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91dee131-1a24-40d0-9e1d-b9e4e3e2906e-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.864208 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"91dee131-1a24-40d0-9e1d-b9e4e3e2906e","Type":"ContainerDied","Data":"5b82bff8e1adacc14ebfaee46c21e8704e0be27de7564944d4818e51d0de3b77"} Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.864267 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b82bff8e1adacc14ebfaee46c21e8704e0be27de7564944d4818e51d0de3b77" Oct 06 13:06:21 crc kubenswrapper[4867]: I1006 13:06:21.864328 4867 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 13:06:22 crc kubenswrapper[4867]: I1006 13:06:22.539209 4867 patch_prober.go:28] interesting pod/console-f9d7485db-rqnc4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Oct 06 13:06:22 crc kubenswrapper[4867]: I1006 13:06:22.539288 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rqnc4" podUID="a85dd45a-f972-4cb7-aa77-e2f8468df1cf" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Oct 06 13:06:22 crc kubenswrapper[4867]: I1006 13:06:22.836918 4867 patch_prober.go:28] interesting pod/router-default-5444994796-svjt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 13:06:22 crc kubenswrapper[4867]: [-]has-synced failed: reason withheld Oct 06 13:06:22 crc kubenswrapper[4867]: [+]process-running ok Oct 06 13:06:22 crc kubenswrapper[4867]: healthz check failed Oct 06 13:06:22 crc kubenswrapper[4867]: I1006 13:06:22.837011 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svjt7" podUID="dc7ecd08-018a-482c-be65-b08bdcbf2ed6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 13:06:23 crc kubenswrapper[4867]: I1006 13:06:23.375946 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " 
pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:06:23 crc kubenswrapper[4867]: I1006 13:06:23.379861 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b78c9415-85bd-40db-b44f-f1e04797a66e-metrics-certs\") pod \"network-metrics-daemon-8t2sq\" (UID: \"b78c9415-85bd-40db-b44f-f1e04797a66e\") " pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:06:23 crc kubenswrapper[4867]: I1006 13:06:23.433788 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8t2sq" Oct 06 13:06:23 crc kubenswrapper[4867]: I1006 13:06:23.843939 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:23 crc kubenswrapper[4867]: I1006 13:06:23.846617 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-svjt7" Oct 06 13:06:30 crc kubenswrapper[4867]: I1006 13:06:30.590132 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:06:32 crc kubenswrapper[4867]: I1006 13:06:32.546373 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:32 crc kubenswrapper[4867]: I1006 13:06:32.552409 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:06:35 crc kubenswrapper[4867]: E1006 13:06:35.787756 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 06 13:06:35 crc kubenswrapper[4867]: E1006 13:06:35.788040 4867 kuberuntime_manager.go:1274] "Unhandled Error" 
err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p69pz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hx8v4_openshift-marketplace(e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 13:06:35 crc kubenswrapper[4867]: E1006 13:06:35.789288 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hx8v4" podUID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" Oct 06 13:06:37 crc kubenswrapper[4867]: E1006 13:06:37.090398 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hx8v4" podUID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" Oct 06 13:06:37 crc kubenswrapper[4867]: E1006 13:06:37.211435 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 06 13:06:37 crc kubenswrapper[4867]: E1006 13:06:37.211641 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wn296,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zzpjw_openshift-marketplace(fa3ccdb7-011e-4f41-9599-dab85489545e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 13:06:37 crc kubenswrapper[4867]: E1006 13:06:37.213072 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zzpjw" podUID="fa3ccdb7-011e-4f41-9599-dab85489545e" Oct 06 13:06:39 crc 
kubenswrapper[4867]: E1006 13:06:39.361428 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zzpjw" podUID="fa3ccdb7-011e-4f41-9599-dab85489545e" Oct 06 13:06:41 crc kubenswrapper[4867]: E1006 13:06:41.107574 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 13:06:41 crc kubenswrapper[4867]: E1006 13:06:41.108151 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nq9fb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wv65z_openshift-marketplace(74e08994-05ef-40f2-903f-bb6450059e88): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 13:06:41 crc kubenswrapper[4867]: E1006 13:06:41.109459 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wv65z" podUID="74e08994-05ef-40f2-903f-bb6450059e88" Oct 06 13:06:41 crc 
kubenswrapper[4867]: E1006 13:06:41.150780 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 13:06:41 crc kubenswrapper[4867]: E1006 13:06:41.150964 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mfbw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-6djpm_openshift-marketplace(c8449de1-c234-45f5-a6dc-e1e300785924): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 13:06:41 crc kubenswrapper[4867]: E1006 13:06:41.155343 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6djpm" podUID="c8449de1-c234-45f5-a6dc-e1e300785924" Oct 06 13:06:41 crc kubenswrapper[4867]: I1006 13:06:41.495945 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8t2sq"] Oct 06 13:06:41 crc kubenswrapper[4867]: W1006 13:06:41.501065 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb78c9415_85bd_40db_b44f_f1e04797a66e.slice/crio-414ae372b011d70cf6b39fe11cdf7519ab9e2ab49b5d6adec66dd252bcee9853 WatchSource:0}: Error finding container 414ae372b011d70cf6b39fe11cdf7519ab9e2ab49b5d6adec66dd252bcee9853: Status 404 returned error can't find the container with id 414ae372b011d70cf6b39fe11cdf7519ab9e2ab49b5d6adec66dd252bcee9853 Oct 06 13:06:41 crc kubenswrapper[4867]: I1006 13:06:41.974735 4867 generic.go:334] "Generic (PLEG): container finished" podID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" containerID="6b74cf990b57afb9f1a8df3aa59c58818066ea486002c47509996dd0dd430e58" exitCode=0 Oct 06 13:06:41 crc kubenswrapper[4867]: I1006 13:06:41.974820 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65gkn" event={"ID":"4d2f5af5-a716-4878-ac8b-81c7636ffd7e","Type":"ContainerDied","Data":"6b74cf990b57afb9f1a8df3aa59c58818066ea486002c47509996dd0dd430e58"} Oct 06 13:06:41 crc kubenswrapper[4867]: I1006 13:06:41.977853 4867 
generic.go:334] "Generic (PLEG): container finished" podID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" containerID="43dc9e0893f1e942f25577ea72894058d44bc3b0f537f19ddf1d0f3ea5aaff6d" exitCode=0 Oct 06 13:06:41 crc kubenswrapper[4867]: I1006 13:06:41.977939 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwk8q" event={"ID":"7971c0f6-6d96-4e00-9bda-003c54ad5d53","Type":"ContainerDied","Data":"43dc9e0893f1e942f25577ea72894058d44bc3b0f537f19ddf1d0f3ea5aaff6d"} Oct 06 13:06:41 crc kubenswrapper[4867]: I1006 13:06:41.984426 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" event={"ID":"b78c9415-85bd-40db-b44f-f1e04797a66e","Type":"ContainerStarted","Data":"a00de68f0438d450a31e12977328acbaa01989a60325eaa79693550d7e11cd31"} Oct 06 13:06:41 crc kubenswrapper[4867]: I1006 13:06:41.984480 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" event={"ID":"b78c9415-85bd-40db-b44f-f1e04797a66e","Type":"ContainerStarted","Data":"414ae372b011d70cf6b39fe11cdf7519ab9e2ab49b5d6adec66dd252bcee9853"} Oct 06 13:06:41 crc kubenswrapper[4867]: I1006 13:06:41.988490 4867 generic.go:334] "Generic (PLEG): container finished" podID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" containerID="7885e903dd569b4886eb2b3f6fcc9ea49f5d227b2a54dd2131265216f4184c75" exitCode=0 Oct 06 13:06:41 crc kubenswrapper[4867]: I1006 13:06:41.988570 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d98tk" event={"ID":"ae5cd503-d6b1-4056-a0b8-b302a68d404b","Type":"ContainerDied","Data":"7885e903dd569b4886eb2b3f6fcc9ea49f5d227b2a54dd2131265216f4184c75"} Oct 06 13:06:41 crc kubenswrapper[4867]: I1006 13:06:41.990548 4867 generic.go:334] "Generic (PLEG): container finished" podID="b253d2fc-0c97-4110-8e96-f127181ff1ff" containerID="3737cf805e43f6e4c78d0371583f1533fe38ce51b15aba234144f7ab193fd801" exitCode=0 Oct 06 13:06:41 
crc kubenswrapper[4867]: I1006 13:06:41.992168 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8xqc" event={"ID":"b253d2fc-0c97-4110-8e96-f127181ff1ff","Type":"ContainerDied","Data":"3737cf805e43f6e4c78d0371583f1533fe38ce51b15aba234144f7ab193fd801"} Oct 06 13:06:41 crc kubenswrapper[4867]: E1006 13:06:41.997962 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6djpm" podUID="c8449de1-c234-45f5-a6dc-e1e300785924" Oct 06 13:06:41 crc kubenswrapper[4867]: E1006 13:06:41.998457 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wv65z" podUID="74e08994-05ef-40f2-903f-bb6450059e88" Oct 06 13:06:42 crc kubenswrapper[4867]: I1006 13:06:42.874176 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:06:42 crc kubenswrapper[4867]: I1006 13:06:42.874490 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:06:42 crc kubenswrapper[4867]: I1006 13:06:42.942586 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rhkpx" Oct 06 13:06:43 crc kubenswrapper[4867]: I1006 13:06:43.004449 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8t2sq" event={"ID":"b78c9415-85bd-40db-b44f-f1e04797a66e","Type":"ContainerStarted","Data":"20210fce7a016fc975a8ce3e446ff9e0ee68d0738aad6492e31d75bb11797b6f"} Oct 06 13:06:43 crc kubenswrapper[4867]: I1006 13:06:43.032661 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8t2sq" podStartSLOduration=162.03262031 podStartE2EDuration="2m42.03262031s" podCreationTimestamp="2025-10-06 13:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:06:43.021355735 +0000 UTC m=+182.479303879" watchObservedRunningTime="2025-10-06 13:06:43.03262031 +0000 UTC m=+182.490568464" Oct 06 13:06:44 crc kubenswrapper[4867]: I1006 13:06:44.014450 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65gkn" event={"ID":"4d2f5af5-a716-4878-ac8b-81c7636ffd7e","Type":"ContainerStarted","Data":"03205b186cca1f21bd51bff49e0ec83d0e93cbd4a0d5350cde9dd3e3ee54bdba"} Oct 06 13:06:44 crc kubenswrapper[4867]: I1006 13:06:44.017517 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwk8q" event={"ID":"7971c0f6-6d96-4e00-9bda-003c54ad5d53","Type":"ContainerStarted","Data":"3d29b019f77c40278a2dd97db4724425cced02bed0e00ae70e7a897fd586e5cc"} Oct 06 13:06:44 crc kubenswrapper[4867]: I1006 13:06:44.021975 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8xqc" event={"ID":"b253d2fc-0c97-4110-8e96-f127181ff1ff","Type":"ContainerStarted","Data":"32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974"} Oct 06 13:06:44 crc kubenswrapper[4867]: I1006 
13:06:44.031875 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-65gkn" podStartSLOduration=3.014956742 podStartE2EDuration="32.031854079s" podCreationTimestamp="2025-10-06 13:06:12 +0000 UTC" firstStartedPulling="2025-10-06 13:06:14.716651548 +0000 UTC m=+154.174599692" lastFinishedPulling="2025-10-06 13:06:43.733548885 +0000 UTC m=+183.191497029" observedRunningTime="2025-10-06 13:06:44.031550611 +0000 UTC m=+183.489498745" watchObservedRunningTime="2025-10-06 13:06:44.031854079 +0000 UTC m=+183.489802223" Oct 06 13:06:44 crc kubenswrapper[4867]: I1006 13:06:44.051525 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x8xqc" podStartSLOduration=2.762729409 podStartE2EDuration="35.051510245s" podCreationTimestamp="2025-10-06 13:06:09 +0000 UTC" firstStartedPulling="2025-10-06 13:06:11.555701829 +0000 UTC m=+151.013649973" lastFinishedPulling="2025-10-06 13:06:43.844482665 +0000 UTC m=+183.302430809" observedRunningTime="2025-10-06 13:06:44.049369209 +0000 UTC m=+183.507317353" watchObservedRunningTime="2025-10-06 13:06:44.051510245 +0000 UTC m=+183.509458389" Oct 06 13:06:44 crc kubenswrapper[4867]: I1006 13:06:44.067213 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cwk8q" podStartSLOduration=1.847934225 podStartE2EDuration="35.067176986s" podCreationTimestamp="2025-10-06 13:06:09 +0000 UTC" firstStartedPulling="2025-10-06 13:06:10.508532473 +0000 UTC m=+149.966480617" lastFinishedPulling="2025-10-06 13:06:43.727775234 +0000 UTC m=+183.185723378" observedRunningTime="2025-10-06 13:06:44.065707677 +0000 UTC m=+183.523655841" watchObservedRunningTime="2025-10-06 13:06:44.067176986 +0000 UTC m=+183.525125140" Oct 06 13:06:45 crc kubenswrapper[4867]: I1006 13:06:45.029757 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-d98tk" event={"ID":"ae5cd503-d6b1-4056-a0b8-b302a68d404b","Type":"ContainerStarted","Data":"050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2"} Oct 06 13:06:45 crc kubenswrapper[4867]: I1006 13:06:45.051169 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d98tk" podStartSLOduration=3.790599937 podStartE2EDuration="33.051149585s" podCreationTimestamp="2025-10-06 13:06:12 +0000 UTC" firstStartedPulling="2025-10-06 13:06:14.687903544 +0000 UTC m=+154.145851688" lastFinishedPulling="2025-10-06 13:06:43.948453182 +0000 UTC m=+183.406401336" observedRunningTime="2025-10-06 13:06:45.04793292 +0000 UTC m=+184.505881064" watchObservedRunningTime="2025-10-06 13:06:45.051149585 +0000 UTC m=+184.509097729" Oct 06 13:06:49 crc kubenswrapper[4867]: I1006 13:06:49.186417 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 13:06:49 crc kubenswrapper[4867]: I1006 13:06:49.449750 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:49 crc kubenswrapper[4867]: I1006 13:06:49.449810 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:49 crc kubenswrapper[4867]: I1006 13:06:49.903697 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:49 crc kubenswrapper[4867]: I1006 13:06:49.904493 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:50 crc kubenswrapper[4867]: I1006 13:06:50.342421 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:50 crc 
kubenswrapper[4867]: I1006 13:06:50.342803 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:50 crc kubenswrapper[4867]: I1006 13:06:50.404507 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:06:51 crc kubenswrapper[4867]: I1006 13:06:51.125279 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:51 crc kubenswrapper[4867]: I1006 13:06:51.883282 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x8xqc"] Oct 06 13:06:52 crc kubenswrapper[4867]: I1006 13:06:52.625530 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:52 crc kubenswrapper[4867]: I1006 13:06:52.626016 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:52 crc kubenswrapper[4867]: I1006 13:06:52.678387 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.018171 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.018681 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.077169 4867 generic.go:334] "Generic (PLEG): container finished" podID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" containerID="7d7bb2d2cac207083455ee279e64fbc445374bc0465c50ab701c12602d6eae3f" exitCode=0 Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.077311 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8v4" event={"ID":"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d","Type":"ContainerDied","Data":"7d7bb2d2cac207083455ee279e64fbc445374bc0465c50ab701c12602d6eae3f"} Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.078377 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x8xqc" podUID="b253d2fc-0c97-4110-8e96-f127181ff1ff" containerName="registry-server" containerID="cri-o://32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974" gracePeriod=2 Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.093054 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.126061 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.628914 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.714213 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lq75\" (UniqueName: \"kubernetes.io/projected/b253d2fc-0c97-4110-8e96-f127181ff1ff-kube-api-access-9lq75\") pod \"b253d2fc-0c97-4110-8e96-f127181ff1ff\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.714355 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-catalog-content\") pod \"b253d2fc-0c97-4110-8e96-f127181ff1ff\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.714418 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-utilities\") pod \"b253d2fc-0c97-4110-8e96-f127181ff1ff\" (UID: \"b253d2fc-0c97-4110-8e96-f127181ff1ff\") " Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.715397 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-utilities" (OuterVolumeSpecName: "utilities") pod "b253d2fc-0c97-4110-8e96-f127181ff1ff" (UID: "b253d2fc-0c97-4110-8e96-f127181ff1ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.724439 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b253d2fc-0c97-4110-8e96-f127181ff1ff-kube-api-access-9lq75" (OuterVolumeSpecName: "kube-api-access-9lq75") pod "b253d2fc-0c97-4110-8e96-f127181ff1ff" (UID: "b253d2fc-0c97-4110-8e96-f127181ff1ff"). InnerVolumeSpecName "kube-api-access-9lq75". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.759768 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b253d2fc-0c97-4110-8e96-f127181ff1ff" (UID: "b253d2fc-0c97-4110-8e96-f127181ff1ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.816013 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.816066 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b253d2fc-0c97-4110-8e96-f127181ff1ff-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:53 crc kubenswrapper[4867]: I1006 13:06:53.816079 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lq75\" (UniqueName: \"kubernetes.io/projected/b253d2fc-0c97-4110-8e96-f127181ff1ff-kube-api-access-9lq75\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.085380 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8v4" event={"ID":"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d","Type":"ContainerStarted","Data":"f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54"} Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.087497 4867 generic.go:334] "Generic (PLEG): container finished" podID="c8449de1-c234-45f5-a6dc-e1e300785924" containerID="72d61cef51f574675ad138d6f4989e3a886ec0e845a59fe47c544b094e05be4d" exitCode=0 Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.087566 4867 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-6djpm" event={"ID":"c8449de1-c234-45f5-a6dc-e1e300785924","Type":"ContainerDied","Data":"72d61cef51f574675ad138d6f4989e3a886ec0e845a59fe47c544b094e05be4d"} Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.092625 4867 generic.go:334] "Generic (PLEG): container finished" podID="fa3ccdb7-011e-4f41-9599-dab85489545e" containerID="b79348a59faee846058b905c5c39be6425adefa0d04dd12361115fafb3fe9c3c" exitCode=0 Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.092683 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzpjw" event={"ID":"fa3ccdb7-011e-4f41-9599-dab85489545e","Type":"ContainerDied","Data":"b79348a59faee846058b905c5c39be6425adefa0d04dd12361115fafb3fe9c3c"} Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.096087 4867 generic.go:334] "Generic (PLEG): container finished" podID="b253d2fc-0c97-4110-8e96-f127181ff1ff" containerID="32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974" exitCode=0 Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.096201 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8xqc" event={"ID":"b253d2fc-0c97-4110-8e96-f127181ff1ff","Type":"ContainerDied","Data":"32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974"} Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.096227 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x8xqc" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.096257 4867 scope.go:117] "RemoveContainer" containerID="32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.096239 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8xqc" event={"ID":"b253d2fc-0c97-4110-8e96-f127181ff1ff","Type":"ContainerDied","Data":"c0a24bbdc8dcb4e48b37476a97ed1f32e1ba23bcb86a0fb24ccb78a5fec5c052"} Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.121443 4867 scope.go:117] "RemoveContainer" containerID="3737cf805e43f6e4c78d0371583f1533fe38ce51b15aba234144f7ab193fd801" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.133489 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hx8v4" podStartSLOduration=2.853560701 podStartE2EDuration="45.133464295s" podCreationTimestamp="2025-10-06 13:06:09 +0000 UTC" firstStartedPulling="2025-10-06 13:06:11.571603416 +0000 UTC m=+151.029551560" lastFinishedPulling="2025-10-06 13:06:53.85150701 +0000 UTC m=+193.309455154" observedRunningTime="2025-10-06 13:06:54.130727693 +0000 UTC m=+193.588675837" watchObservedRunningTime="2025-10-06 13:06:54.133464295 +0000 UTC m=+193.591412439" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.179797 4867 scope.go:117] "RemoveContainer" containerID="1369df4560fdc0bc66ff67ef9ce01c7cf6b4ac499a17f45a85fff28311ac17d9" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.204544 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.207447 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x8xqc"] Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.214437 4867 
scope.go:117] "RemoveContainer" containerID="32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.214751 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x8xqc"] Oct 06 13:06:54 crc kubenswrapper[4867]: E1006 13:06:54.214984 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974\": container with ID starting with 32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974 not found: ID does not exist" containerID="32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.215011 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974"} err="failed to get container status \"32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974\": rpc error: code = NotFound desc = could not find container \"32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974\": container with ID starting with 32f81932e9808f6e57c5eb945ab85e5b662b9059dea67ecd670e43f6cc8ae974 not found: ID does not exist" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.215068 4867 scope.go:117] "RemoveContainer" containerID="3737cf805e43f6e4c78d0371583f1533fe38ce51b15aba234144f7ab193fd801" Oct 06 13:06:54 crc kubenswrapper[4867]: E1006 13:06:54.215685 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3737cf805e43f6e4c78d0371583f1533fe38ce51b15aba234144f7ab193fd801\": container with ID starting with 3737cf805e43f6e4c78d0371583f1533fe38ce51b15aba234144f7ab193fd801 not found: ID does not exist" containerID="3737cf805e43f6e4c78d0371583f1533fe38ce51b15aba234144f7ab193fd801" Oct 06 
13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.215711 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3737cf805e43f6e4c78d0371583f1533fe38ce51b15aba234144f7ab193fd801"} err="failed to get container status \"3737cf805e43f6e4c78d0371583f1533fe38ce51b15aba234144f7ab193fd801\": rpc error: code = NotFound desc = could not find container \"3737cf805e43f6e4c78d0371583f1533fe38ce51b15aba234144f7ab193fd801\": container with ID starting with 3737cf805e43f6e4c78d0371583f1533fe38ce51b15aba234144f7ab193fd801 not found: ID does not exist" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.215727 4867 scope.go:117] "RemoveContainer" containerID="1369df4560fdc0bc66ff67ef9ce01c7cf6b4ac499a17f45a85fff28311ac17d9" Oct 06 13:06:54 crc kubenswrapper[4867]: E1006 13:06:54.215944 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1369df4560fdc0bc66ff67ef9ce01c7cf6b4ac499a17f45a85fff28311ac17d9\": container with ID starting with 1369df4560fdc0bc66ff67ef9ce01c7cf6b4ac499a17f45a85fff28311ac17d9 not found: ID does not exist" containerID="1369df4560fdc0bc66ff67ef9ce01c7cf6b4ac499a17f45a85fff28311ac17d9" Oct 06 13:06:54 crc kubenswrapper[4867]: I1006 13:06:54.215963 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1369df4560fdc0bc66ff67ef9ce01c7cf6b4ac499a17f45a85fff28311ac17d9"} err="failed to get container status \"1369df4560fdc0bc66ff67ef9ce01c7cf6b4ac499a17f45a85fff28311ac17d9\": rpc error: code = NotFound desc = could not find container \"1369df4560fdc0bc66ff67ef9ce01c7cf6b4ac499a17f45a85fff28311ac17d9\": container with ID starting with 1369df4560fdc0bc66ff67ef9ce01c7cf6b4ac499a17f45a85fff28311ac17d9 not found: ID does not exist" Oct 06 13:06:55 crc kubenswrapper[4867]: I1006 13:06:55.119375 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6djpm" 
event={"ID":"c8449de1-c234-45f5-a6dc-e1e300785924","Type":"ContainerStarted","Data":"042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8"} Oct 06 13:06:55 crc kubenswrapper[4867]: I1006 13:06:55.142633 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzpjw" event={"ID":"fa3ccdb7-011e-4f41-9599-dab85489545e","Type":"ContainerStarted","Data":"54740426ebf1651c843ead157cb78b7dc79be7d09e532997ad43b69a64da05f3"} Oct 06 13:06:55 crc kubenswrapper[4867]: I1006 13:06:55.144157 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6djpm" podStartSLOduration=1.971830142 podStartE2EDuration="44.144135014s" podCreationTimestamp="2025-10-06 13:06:11 +0000 UTC" firstStartedPulling="2025-10-06 13:06:12.603852951 +0000 UTC m=+152.061801095" lastFinishedPulling="2025-10-06 13:06:54.776157823 +0000 UTC m=+194.234105967" observedRunningTime="2025-10-06 13:06:55.142329007 +0000 UTC m=+194.600277161" watchObservedRunningTime="2025-10-06 13:06:55.144135014 +0000 UTC m=+194.602083158" Oct 06 13:06:55 crc kubenswrapper[4867]: I1006 13:06:55.170661 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zzpjw" podStartSLOduration=2.908615325 podStartE2EDuration="46.170633919s" podCreationTimestamp="2025-10-06 13:06:09 +0000 UTC" firstStartedPulling="2025-10-06 13:06:11.575775975 +0000 UTC m=+151.033724119" lastFinishedPulling="2025-10-06 13:06:54.837794569 +0000 UTC m=+194.295742713" observedRunningTime="2025-10-06 13:06:55.166867001 +0000 UTC m=+194.624815145" watchObservedRunningTime="2025-10-06 13:06:55.170633919 +0000 UTC m=+194.628582063" Oct 06 13:06:55 crc kubenswrapper[4867]: I1006 13:06:55.230529 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b253d2fc-0c97-4110-8e96-f127181ff1ff" path="/var/lib/kubelet/pods/b253d2fc-0c97-4110-8e96-f127181ff1ff/volumes" Oct 06 
13:06:56 crc kubenswrapper[4867]: I1006 13:06:56.484698 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d98tk"] Oct 06 13:06:56 crc kubenswrapper[4867]: I1006 13:06:56.485224 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d98tk" podUID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" containerName="registry-server" containerID="cri-o://050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2" gracePeriod=2 Oct 06 13:06:56 crc kubenswrapper[4867]: I1006 13:06:56.849425 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:56 crc kubenswrapper[4867]: I1006 13:06:56.963354 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-utilities\") pod \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") " Oct 06 13:06:56 crc kubenswrapper[4867]: I1006 13:06:56.963444 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-catalog-content\") pod \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") " Oct 06 13:06:56 crc kubenswrapper[4867]: I1006 13:06:56.963476 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g225c\" (UniqueName: \"kubernetes.io/projected/ae5cd503-d6b1-4056-a0b8-b302a68d404b-kube-api-access-g225c\") pod \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\" (UID: \"ae5cd503-d6b1-4056-a0b8-b302a68d404b\") " Oct 06 13:06:56 crc kubenswrapper[4867]: I1006 13:06:56.964480 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-utilities" (OuterVolumeSpecName: "utilities") pod "ae5cd503-d6b1-4056-a0b8-b302a68d404b" (UID: "ae5cd503-d6b1-4056-a0b8-b302a68d404b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:06:56 crc kubenswrapper[4867]: I1006 13:06:56.970205 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5cd503-d6b1-4056-a0b8-b302a68d404b-kube-api-access-g225c" (OuterVolumeSpecName: "kube-api-access-g225c") pod "ae5cd503-d6b1-4056-a0b8-b302a68d404b" (UID: "ae5cd503-d6b1-4056-a0b8-b302a68d404b"). InnerVolumeSpecName "kube-api-access-g225c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.047964 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae5cd503-d6b1-4056-a0b8-b302a68d404b" (UID: "ae5cd503-d6b1-4056-a0b8-b302a68d404b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.065556 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.065601 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5cd503-d6b1-4056-a0b8-b302a68d404b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.065615 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g225c\" (UniqueName: \"kubernetes.io/projected/ae5cd503-d6b1-4056-a0b8-b302a68d404b-kube-api-access-g225c\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.158674 4867 generic.go:334] "Generic (PLEG): container finished" podID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" containerID="050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2" exitCode=0 Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.158751 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d98tk" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.158744 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d98tk" event={"ID":"ae5cd503-d6b1-4056-a0b8-b302a68d404b","Type":"ContainerDied","Data":"050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2"} Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.158911 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d98tk" event={"ID":"ae5cd503-d6b1-4056-a0b8-b302a68d404b","Type":"ContainerDied","Data":"ef209bc8d9bbb429f7c6dd113bbe367f7f445b5f7860aafc741a7c6325a89445"} Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.158940 4867 scope.go:117] "RemoveContainer" containerID="050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.185626 4867 scope.go:117] "RemoveContainer" containerID="7885e903dd569b4886eb2b3f6fcc9ea49f5d227b2a54dd2131265216f4184c75" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.193978 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d98tk"] Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.198008 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d98tk"] Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.212444 4867 scope.go:117] "RemoveContainer" containerID="a48d0133b5ea33215232539d0bf1785e93ca16402382f78b99221d15801fb567" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.229686 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" path="/var/lib/kubelet/pods/ae5cd503-d6b1-4056-a0b8-b302a68d404b/volumes" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.230044 4867 scope.go:117] "RemoveContainer" 
containerID="050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2" Oct 06 13:06:57 crc kubenswrapper[4867]: E1006 13:06:57.230372 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2\": container with ID starting with 050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2 not found: ID does not exist" containerID="050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.230404 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2"} err="failed to get container status \"050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2\": rpc error: code = NotFound desc = could not find container \"050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2\": container with ID starting with 050665c6e7b44d73eadeaf36323ac99b4c63fb0a31bb76d4770383306bd065c2 not found: ID does not exist" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.230426 4867 scope.go:117] "RemoveContainer" containerID="7885e903dd569b4886eb2b3f6fcc9ea49f5d227b2a54dd2131265216f4184c75" Oct 06 13:06:57 crc kubenswrapper[4867]: E1006 13:06:57.231003 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7885e903dd569b4886eb2b3f6fcc9ea49f5d227b2a54dd2131265216f4184c75\": container with ID starting with 7885e903dd569b4886eb2b3f6fcc9ea49f5d227b2a54dd2131265216f4184c75 not found: ID does not exist" containerID="7885e903dd569b4886eb2b3f6fcc9ea49f5d227b2a54dd2131265216f4184c75" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.231040 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7885e903dd569b4886eb2b3f6fcc9ea49f5d227b2a54dd2131265216f4184c75"} err="failed to get container status \"7885e903dd569b4886eb2b3f6fcc9ea49f5d227b2a54dd2131265216f4184c75\": rpc error: code = NotFound desc = could not find container \"7885e903dd569b4886eb2b3f6fcc9ea49f5d227b2a54dd2131265216f4184c75\": container with ID starting with 7885e903dd569b4886eb2b3f6fcc9ea49f5d227b2a54dd2131265216f4184c75 not found: ID does not exist" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.231052 4867 scope.go:117] "RemoveContainer" containerID="a48d0133b5ea33215232539d0bf1785e93ca16402382f78b99221d15801fb567" Oct 06 13:06:57 crc kubenswrapper[4867]: E1006 13:06:57.231280 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a48d0133b5ea33215232539d0bf1785e93ca16402382f78b99221d15801fb567\": container with ID starting with a48d0133b5ea33215232539d0bf1785e93ca16402382f78b99221d15801fb567 not found: ID does not exist" containerID="a48d0133b5ea33215232539d0bf1785e93ca16402382f78b99221d15801fb567" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.231299 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48d0133b5ea33215232539d0bf1785e93ca16402382f78b99221d15801fb567"} err="failed to get container status \"a48d0133b5ea33215232539d0bf1785e93ca16402382f78b99221d15801fb567\": rpc error: code = NotFound desc = could not find container \"a48d0133b5ea33215232539d0bf1785e93ca16402382f78b99221d15801fb567\": container with ID starting with a48d0133b5ea33215232539d0bf1785e93ca16402382f78b99221d15801fb567 not found: ID does not exist" Oct 06 13:06:57 crc kubenswrapper[4867]: I1006 13:06:57.913456 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ftjpf"] Oct 06 13:06:58 crc kubenswrapper[4867]: I1006 13:06:58.168238 4867 generic.go:334] "Generic (PLEG): container 
finished" podID="74e08994-05ef-40f2-903f-bb6450059e88" containerID="7cb69bffb95c9e7f4264b716cb72839c81c5a5982427527368c5e85706aeb8ca" exitCode=0 Oct 06 13:06:58 crc kubenswrapper[4867]: I1006 13:06:58.168308 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv65z" event={"ID":"74e08994-05ef-40f2-903f-bb6450059e88","Type":"ContainerDied","Data":"7cb69bffb95c9e7f4264b716cb72839c81c5a5982427527368c5e85706aeb8ca"} Oct 06 13:06:59 crc kubenswrapper[4867]: I1006 13:06:59.178852 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv65z" event={"ID":"74e08994-05ef-40f2-903f-bb6450059e88","Type":"ContainerStarted","Data":"4555cac3fda8704edad0596b8326caef39a49ee91bd9cf31cad3d286ae7bc939"} Oct 06 13:06:59 crc kubenswrapper[4867]: I1006 13:06:59.203507 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wv65z" podStartSLOduration=2.200519512 podStartE2EDuration="48.203482828s" podCreationTimestamp="2025-10-06 13:06:11 +0000 UTC" firstStartedPulling="2025-10-06 13:06:12.634064794 +0000 UTC m=+152.092012938" lastFinishedPulling="2025-10-06 13:06:58.63702811 +0000 UTC m=+198.094976254" observedRunningTime="2025-10-06 13:06:59.201337641 +0000 UTC m=+198.659285785" watchObservedRunningTime="2025-10-06 13:06:59.203482828 +0000 UTC m=+198.661430972" Oct 06 13:06:59 crc kubenswrapper[4867]: I1006 13:06:59.661197 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:59 crc kubenswrapper[4867]: I1006 13:06:59.661336 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:06:59 crc kubenswrapper[4867]: I1006 13:06:59.703084 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 
13:07:00 crc kubenswrapper[4867]: I1006 13:07:00.106492 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:07:00 crc kubenswrapper[4867]: I1006 13:07:00.107118 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:07:00 crc kubenswrapper[4867]: I1006 13:07:00.154359 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:07:00 crc kubenswrapper[4867]: I1006 13:07:00.226315 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:07:00 crc kubenswrapper[4867]: I1006 13:07:00.230436 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:07:01 crc kubenswrapper[4867]: I1006 13:07:01.600398 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:07:01 crc kubenswrapper[4867]: I1006 13:07:01.600452 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:07:01 crc kubenswrapper[4867]: I1006 13:07:01.646953 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:07:02 crc kubenswrapper[4867]: I1006 13:07:02.011243 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:07:02 crc kubenswrapper[4867]: I1006 13:07:02.011362 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:07:02 crc kubenswrapper[4867]: I1006 13:07:02.061987 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:07:02 crc kubenswrapper[4867]: I1006 13:07:02.240038 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:07:02 crc kubenswrapper[4867]: I1006 13:07:02.884096 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6djpm"] Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.205060 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6djpm" podUID="c8449de1-c234-45f5-a6dc-e1e300785924" containerName="registry-server" containerID="cri-o://042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8" gracePeriod=2 Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.297156 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hx8v4"] Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.298047 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hx8v4" podUID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" containerName="registry-server" containerID="cri-o://f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54" gracePeriod=2 Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.580572 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.691198 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.703851 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-utilities\") pod \"c8449de1-c234-45f5-a6dc-e1e300785924\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.703971 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfbw5\" (UniqueName: \"kubernetes.io/projected/c8449de1-c234-45f5-a6dc-e1e300785924-kube-api-access-mfbw5\") pod \"c8449de1-c234-45f5-a6dc-e1e300785924\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.704677 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-utilities" (OuterVolumeSpecName: "utilities") pod "c8449de1-c234-45f5-a6dc-e1e300785924" (UID: "c8449de1-c234-45f5-a6dc-e1e300785924"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.705194 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-catalog-content\") pod \"c8449de1-c234-45f5-a6dc-e1e300785924\" (UID: \"c8449de1-c234-45f5-a6dc-e1e300785924\") " Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.705746 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.713546 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8449de1-c234-45f5-a6dc-e1e300785924-kube-api-access-mfbw5" (OuterVolumeSpecName: "kube-api-access-mfbw5") pod "c8449de1-c234-45f5-a6dc-e1e300785924" (UID: "c8449de1-c234-45f5-a6dc-e1e300785924"). InnerVolumeSpecName "kube-api-access-mfbw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.719494 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8449de1-c234-45f5-a6dc-e1e300785924" (UID: "c8449de1-c234-45f5-a6dc-e1e300785924"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.806334 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p69pz\" (UniqueName: \"kubernetes.io/projected/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-kube-api-access-p69pz\") pod \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.806446 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-utilities\") pod \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.806477 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-catalog-content\") pod \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\" (UID: \"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d\") " Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.806675 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfbw5\" (UniqueName: \"kubernetes.io/projected/c8449de1-c234-45f5-a6dc-e1e300785924-kube-api-access-mfbw5\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.806688 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8449de1-c234-45f5-a6dc-e1e300785924-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.807160 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-utilities" (OuterVolumeSpecName: "utilities") pod "e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" (UID: 
"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.809193 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-kube-api-access-p69pz" (OuterVolumeSpecName: "kube-api-access-p69pz") pod "e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" (UID: "e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d"). InnerVolumeSpecName "kube-api-access-p69pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.848156 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" (UID: "e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.908110 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p69pz\" (UniqueName: \"kubernetes.io/projected/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-kube-api-access-p69pz\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.908155 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:04 crc kubenswrapper[4867]: I1006 13:07:04.908168 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.211062 4867 generic.go:334] "Generic (PLEG): container finished" 
podID="c8449de1-c234-45f5-a6dc-e1e300785924" containerID="042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8" exitCode=0 Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.211136 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6djpm" event={"ID":"c8449de1-c234-45f5-a6dc-e1e300785924","Type":"ContainerDied","Data":"042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8"} Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.211154 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6djpm" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.211179 4867 scope.go:117] "RemoveContainer" containerID="042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.211166 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6djpm" event={"ID":"c8449de1-c234-45f5-a6dc-e1e300785924","Type":"ContainerDied","Data":"b6df1b9e00f091f33a7789243500b1e472e6eab42dfe98dcb2e8f5a3f26344ec"} Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.213606 4867 generic.go:334] "Generic (PLEG): container finished" podID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" containerID="f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54" exitCode=0 Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.213691 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hx8v4" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.213757 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8v4" event={"ID":"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d","Type":"ContainerDied","Data":"f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54"} Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.213842 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8v4" event={"ID":"e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d","Type":"ContainerDied","Data":"1c5ba34e65b009ff7566376c24cd1c2c8b5dd23ad07a64bda7ffc868884eee22"} Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.226857 4867 scope.go:117] "RemoveContainer" containerID="72d61cef51f574675ad138d6f4989e3a886ec0e845a59fe47c544b094e05be4d" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.253995 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6djpm"] Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.257339 4867 scope.go:117] "RemoveContainer" containerID="f71809e7d75a9e7dbc7185bf146befffdd8601c3cbb9a7831d17a2bd7faa054b" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.272994 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6djpm"] Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.285574 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hx8v4"] Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.288343 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hx8v4"] Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.288512 4867 scope.go:117] "RemoveContainer" containerID="042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8" Oct 06 13:07:05 crc kubenswrapper[4867]: E1006 
13:07:05.290695 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8\": container with ID starting with 042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8 not found: ID does not exist" containerID="042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.290843 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8"} err="failed to get container status \"042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8\": rpc error: code = NotFound desc = could not find container \"042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8\": container with ID starting with 042975c6f9e0817358769b64a5e0509599b05ec60214c598e34a8bae08810ab8 not found: ID does not exist" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.290935 4867 scope.go:117] "RemoveContainer" containerID="72d61cef51f574675ad138d6f4989e3a886ec0e845a59fe47c544b094e05be4d" Oct 06 13:07:05 crc kubenswrapper[4867]: E1006 13:07:05.291391 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d61cef51f574675ad138d6f4989e3a886ec0e845a59fe47c544b094e05be4d\": container with ID starting with 72d61cef51f574675ad138d6f4989e3a886ec0e845a59fe47c544b094e05be4d not found: ID does not exist" containerID="72d61cef51f574675ad138d6f4989e3a886ec0e845a59fe47c544b094e05be4d" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.291493 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d61cef51f574675ad138d6f4989e3a886ec0e845a59fe47c544b094e05be4d"} err="failed to get container status \"72d61cef51f574675ad138d6f4989e3a886ec0e845a59fe47c544b094e05be4d\": rpc 
error: code = NotFound desc = could not find container \"72d61cef51f574675ad138d6f4989e3a886ec0e845a59fe47c544b094e05be4d\": container with ID starting with 72d61cef51f574675ad138d6f4989e3a886ec0e845a59fe47c544b094e05be4d not found: ID does not exist" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.291563 4867 scope.go:117] "RemoveContainer" containerID="f71809e7d75a9e7dbc7185bf146befffdd8601c3cbb9a7831d17a2bd7faa054b" Oct 06 13:07:05 crc kubenswrapper[4867]: E1006 13:07:05.298237 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71809e7d75a9e7dbc7185bf146befffdd8601c3cbb9a7831d17a2bd7faa054b\": container with ID starting with f71809e7d75a9e7dbc7185bf146befffdd8601c3cbb9a7831d17a2bd7faa054b not found: ID does not exist" containerID="f71809e7d75a9e7dbc7185bf146befffdd8601c3cbb9a7831d17a2bd7faa054b" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.298314 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71809e7d75a9e7dbc7185bf146befffdd8601c3cbb9a7831d17a2bd7faa054b"} err="failed to get container status \"f71809e7d75a9e7dbc7185bf146befffdd8601c3cbb9a7831d17a2bd7faa054b\": rpc error: code = NotFound desc = could not find container \"f71809e7d75a9e7dbc7185bf146befffdd8601c3cbb9a7831d17a2bd7faa054b\": container with ID starting with f71809e7d75a9e7dbc7185bf146befffdd8601c3cbb9a7831d17a2bd7faa054b not found: ID does not exist" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.298348 4867 scope.go:117] "RemoveContainer" containerID="f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.311049 4867 scope.go:117] "RemoveContainer" containerID="7d7bb2d2cac207083455ee279e64fbc445374bc0465c50ab701c12602d6eae3f" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.338245 4867 scope.go:117] "RemoveContainer" 
containerID="00f4c2cd9cb4237305267ff488e2f690bdfe8b71106491db12e5ad17307dfa08" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.354783 4867 scope.go:117] "RemoveContainer" containerID="f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54" Oct 06 13:07:05 crc kubenswrapper[4867]: E1006 13:07:05.355522 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54\": container with ID starting with f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54 not found: ID does not exist" containerID="f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.355555 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54"} err="failed to get container status \"f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54\": rpc error: code = NotFound desc = could not find container \"f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54\": container with ID starting with f656fc16b92a7eccb9ecdce17f36a8762698dbc53fbf8c2bcf38fa649394fe54 not found: ID does not exist" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.355576 4867 scope.go:117] "RemoveContainer" containerID="7d7bb2d2cac207083455ee279e64fbc445374bc0465c50ab701c12602d6eae3f" Oct 06 13:07:05 crc kubenswrapper[4867]: E1006 13:07:05.355923 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7bb2d2cac207083455ee279e64fbc445374bc0465c50ab701c12602d6eae3f\": container with ID starting with 7d7bb2d2cac207083455ee279e64fbc445374bc0465c50ab701c12602d6eae3f not found: ID does not exist" containerID="7d7bb2d2cac207083455ee279e64fbc445374bc0465c50ab701c12602d6eae3f" Oct 06 13:07:05 crc 
kubenswrapper[4867]: I1006 13:07:05.355945 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7bb2d2cac207083455ee279e64fbc445374bc0465c50ab701c12602d6eae3f"} err="failed to get container status \"7d7bb2d2cac207083455ee279e64fbc445374bc0465c50ab701c12602d6eae3f\": rpc error: code = NotFound desc = could not find container \"7d7bb2d2cac207083455ee279e64fbc445374bc0465c50ab701c12602d6eae3f\": container with ID starting with 7d7bb2d2cac207083455ee279e64fbc445374bc0465c50ab701c12602d6eae3f not found: ID does not exist" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.355961 4867 scope.go:117] "RemoveContainer" containerID="00f4c2cd9cb4237305267ff488e2f690bdfe8b71106491db12e5ad17307dfa08" Oct 06 13:07:05 crc kubenswrapper[4867]: E1006 13:07:05.356294 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f4c2cd9cb4237305267ff488e2f690bdfe8b71106491db12e5ad17307dfa08\": container with ID starting with 00f4c2cd9cb4237305267ff488e2f690bdfe8b71106491db12e5ad17307dfa08 not found: ID does not exist" containerID="00f4c2cd9cb4237305267ff488e2f690bdfe8b71106491db12e5ad17307dfa08" Oct 06 13:07:05 crc kubenswrapper[4867]: I1006 13:07:05.356319 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f4c2cd9cb4237305267ff488e2f690bdfe8b71106491db12e5ad17307dfa08"} err="failed to get container status \"00f4c2cd9cb4237305267ff488e2f690bdfe8b71106491db12e5ad17307dfa08\": rpc error: code = NotFound desc = could not find container \"00f4c2cd9cb4237305267ff488e2f690bdfe8b71106491db12e5ad17307dfa08\": container with ID starting with 00f4c2cd9cb4237305267ff488e2f690bdfe8b71106491db12e5ad17307dfa08 not found: ID does not exist" Oct 06 13:07:07 crc kubenswrapper[4867]: I1006 13:07:07.230169 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8449de1-c234-45f5-a6dc-e1e300785924" 
path="/var/lib/kubelet/pods/c8449de1-c234-45f5-a6dc-e1e300785924/volumes" Oct 06 13:07:07 crc kubenswrapper[4867]: I1006 13:07:07.231335 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" path="/var/lib/kubelet/pods/e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d/volumes" Oct 06 13:07:11 crc kubenswrapper[4867]: I1006 13:07:11.656918 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:07:12 crc kubenswrapper[4867]: I1006 13:07:12.874067 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:07:12 crc kubenswrapper[4867]: I1006 13:07:12.874565 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:07:12 crc kubenswrapper[4867]: I1006 13:07:12.874626 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:07:12 crc kubenswrapper[4867]: I1006 13:07:12.875438 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:07:12 crc kubenswrapper[4867]: I1006 13:07:12.875497 4867 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5" gracePeriod=600 Oct 06 13:07:13 crc kubenswrapper[4867]: I1006 13:07:13.261928 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5" exitCode=0 Oct 06 13:07:13 crc kubenswrapper[4867]: I1006 13:07:13.262108 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5"} Oct 06 13:07:13 crc kubenswrapper[4867]: I1006 13:07:13.262484 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"7ebc66b3368265481d7ce17a498c1898a6fa0d78101df6ffd71b9d951872175a"} Oct 06 13:07:22 crc kubenswrapper[4867]: I1006 13:07:22.940057 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" podUID="fa021310-c3a2-4feb-93a7-0b2eb6307147" containerName="oauth-openshift" containerID="cri-o://eff23dc662105c5333071fa89927258b06bbb2f3d97487fef7d0dbec738a61c9" gracePeriod=15 Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.323610 4867 generic.go:334] "Generic (PLEG): container finished" podID="fa021310-c3a2-4feb-93a7-0b2eb6307147" containerID="eff23dc662105c5333071fa89927258b06bbb2f3d97487fef7d0dbec738a61c9" exitCode=0 Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.323756 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" event={"ID":"fa021310-c3a2-4feb-93a7-0b2eb6307147","Type":"ContainerDied","Data":"eff23dc662105c5333071fa89927258b06bbb2f3d97487fef7d0dbec738a61c9"} Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.324122 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" event={"ID":"fa021310-c3a2-4feb-93a7-0b2eb6307147","Type":"ContainerDied","Data":"f9043228bbef51a1c80d575276e4a56f43447b754a13df57a16e473d586e5268"} Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.324145 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9043228bbef51a1c80d575276e4a56f43447b754a13df57a16e473d586e5268" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.358431 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.411719 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-84cc499644-gcqnj"] Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412269 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8449de1-c234-45f5-a6dc-e1e300785924" containerName="extract-utilities" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412292 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8449de1-c234-45f5-a6dc-e1e300785924" containerName="extract-utilities" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412305 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" containerName="extract-content" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412314 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" containerName="extract-content" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 
13:07:23.412336 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa021310-c3a2-4feb-93a7-0b2eb6307147" containerName="oauth-openshift" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412345 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa021310-c3a2-4feb-93a7-0b2eb6307147" containerName="oauth-openshift" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412364 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8449de1-c234-45f5-a6dc-e1e300785924" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412374 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8449de1-c234-45f5-a6dc-e1e300785924" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412393 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8449de1-c234-45f5-a6dc-e1e300785924" containerName="extract-content" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412403 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8449de1-c234-45f5-a6dc-e1e300785924" containerName="extract-content" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412421 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" containerName="extract-content" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412429 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" containerName="extract-content" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412447 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412456 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 
13:07:23.412476 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376a730a-070b-44d8-8654-555b9c5925c0" containerName="pruner" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412485 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="376a730a-070b-44d8-8654-555b9c5925c0" containerName="pruner" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412507 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b253d2fc-0c97-4110-8e96-f127181ff1ff" containerName="extract-content" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412516 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b253d2fc-0c97-4110-8e96-f127181ff1ff" containerName="extract-content" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412532 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b253d2fc-0c97-4110-8e96-f127181ff1ff" containerName="extract-utilities" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412541 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b253d2fc-0c97-4110-8e96-f127181ff1ff" containerName="extract-utilities" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412551 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91dee131-1a24-40d0-9e1d-b9e4e3e2906e" containerName="pruner" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412561 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="91dee131-1a24-40d0-9e1d-b9e4e3e2906e" containerName="pruner" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412578 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412586 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412596 4867 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" containerName="extract-utilities" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412605 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" containerName="extract-utilities" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412623 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598298ab-238a-4776-9e23-e66c273dc805" containerName="collect-profiles" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412632 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="598298ab-238a-4776-9e23-e66c273dc805" containerName="collect-profiles" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412651 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" containerName="extract-utilities" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412660 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" containerName="extract-utilities" Oct 06 13:07:23 crc kubenswrapper[4867]: E1006 13:07:23.412674 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b253d2fc-0c97-4110-8e96-f127181ff1ff" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412684 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b253d2fc-0c97-4110-8e96-f127181ff1ff" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412907 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="376a730a-070b-44d8-8654-555b9c5925c0" containerName="pruner" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412925 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa021310-c3a2-4feb-93a7-0b2eb6307147" containerName="oauth-openshift" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412950 4867 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c8449de1-c234-45f5-a6dc-e1e300785924" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412966 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="598298ab-238a-4776-9e23-e66c273dc805" containerName="collect-profiles" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.412985 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="91dee131-1a24-40d0-9e1d-b9e4e3e2906e" containerName="pruner" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.413004 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b253d2fc-0c97-4110-8e96-f127181ff1ff" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.413022 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b6fa8f-e40a-4d59-ae1e-df6eeeb0971d" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.413034 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5cd503-d6b1-4056-a0b8-b302a68d404b" containerName="registry-server" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.413805 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.429945 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-84cc499644-gcqnj"] Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.484114 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-session\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.484475 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-dir\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.484601 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-provider-selection\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.484726 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-error\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.485429 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-cliconfig\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.485524 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-idp-0-file-data\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.485610 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-service-ca\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.485751 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-ocp-branding-template\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.485925 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-serving-cert\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.486043 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgz92\" (UniqueName: 
\"kubernetes.io/projected/fa021310-c3a2-4feb-93a7-0b2eb6307147-kube-api-access-xgz92\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.486143 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-trusted-ca-bundle\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.484603 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.486237 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-login\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.486451 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-policies\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.486570 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-router-certs\") pod \"fa021310-c3a2-4feb-93a7-0b2eb6307147\" (UID: \"fa021310-c3a2-4feb-93a7-0b2eb6307147\") " Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.486825 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvdb9\" (UniqueName: \"kubernetes.io/projected/bbbf96c0-9380-48b4-9a0a-7b487982dd87-kube-api-access-fvdb9\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.486930 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bbbf96c0-9380-48b4-9a0a-7b487982dd87-audit-dir\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487047 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487161 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-service-ca\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " 
pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.486481 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.486855 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487045 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487281 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487462 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487630 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-router-certs\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487686 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-serving-cert\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487728 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487781 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-cliconfig\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487836 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487862 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-template-error\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487888 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-session\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487949 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-audit-policies\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.487986 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-template-login\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.488465 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.488503 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.488521 4867 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.488539 4867 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fa021310-c3a2-4feb-93a7-0b2eb6307147-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.488553 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.491672 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.492306 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.494690 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa021310-c3a2-4feb-93a7-0b2eb6307147-kube-api-access-xgz92" (OuterVolumeSpecName: "kube-api-access-xgz92") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "kube-api-access-xgz92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.494831 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.495355 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.496642 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.497002 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.496908 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.497228 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fa021310-c3a2-4feb-93a7-0b2eb6307147" (UID: "fa021310-c3a2-4feb-93a7-0b2eb6307147"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590000 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvdb9\" (UniqueName: \"kubernetes.io/projected/bbbf96c0-9380-48b4-9a0a-7b487982dd87-kube-api-access-fvdb9\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590094 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bbbf96c0-9380-48b4-9a0a-7b487982dd87-audit-dir\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590169 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590217 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-service-ca\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590277 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590323 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-router-certs\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590365 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-serving-cert\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590398 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590439 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-cliconfig\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590491 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-template-error\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590526 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590559 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-session\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590708 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-audit-policies\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590753 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-template-login\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590853 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590885 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590908 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590929 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath 
\"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590950 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgz92\" (UniqueName: \"kubernetes.io/projected/fa021310-c3a2-4feb-93a7-0b2eb6307147-kube-api-access-xgz92\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590969 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.590988 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.591007 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.591027 4867 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fa021310-c3a2-4feb-93a7-0b2eb6307147-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.592394 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bbbf96c0-9380-48b4-9a0a-7b487982dd87-audit-dir\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.592437 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.592735 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-audit-policies\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.593019 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-cliconfig\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.593090 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-service-ca\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.596773 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-router-certs\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " 
pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.597287 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-session\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.598518 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-template-login\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.599598 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.599635 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-template-error\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.600180 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-serving-cert\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.600716 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.603399 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bbbf96c0-9380-48b4-9a0a-7b487982dd87-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.611349 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvdb9\" (UniqueName: \"kubernetes.io/projected/bbbf96c0-9380-48b4-9a0a-7b487982dd87-kube-api-access-fvdb9\") pod \"oauth-openshift-84cc499644-gcqnj\" (UID: \"bbbf96c0-9380-48b4-9a0a-7b487982dd87\") " pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.741628 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:23 crc kubenswrapper[4867]: I1006 13:07:23.953350 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-84cc499644-gcqnj"] Oct 06 13:07:24 crc kubenswrapper[4867]: I1006 13:07:24.332680 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" event={"ID":"bbbf96c0-9380-48b4-9a0a-7b487982dd87","Type":"ContainerStarted","Data":"f1eb4814f57583240b2d5b3e7d19d2c362a434ea9d588b0cb9d576b72e47a847"} Oct 06 13:07:24 crc kubenswrapper[4867]: I1006 13:07:24.332712 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ftjpf" Oct 06 13:07:24 crc kubenswrapper[4867]: I1006 13:07:24.333236 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" event={"ID":"bbbf96c0-9380-48b4-9a0a-7b487982dd87","Type":"ContainerStarted","Data":"af10a7d16d402cddc7c9f107ab2f7cd2323862310d03b036452eb1082abf668e"} Oct 06 13:07:24 crc kubenswrapper[4867]: I1006 13:07:24.333902 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:24 crc kubenswrapper[4867]: I1006 13:07:24.335456 4867 patch_prober.go:28] interesting pod/oauth-openshift-84cc499644-gcqnj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.54:6443/healthz\": dial tcp 10.217.0.54:6443: connect: connection refused" start-of-body= Oct 06 13:07:24 crc kubenswrapper[4867]: I1006 13:07:24.335543 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" podUID="bbbf96c0-9380-48b4-9a0a-7b487982dd87" containerName="oauth-openshift" probeResult="failure" output="Get 
\"https://10.217.0.54:6443/healthz\": dial tcp 10.217.0.54:6443: connect: connection refused" Oct 06 13:07:24 crc kubenswrapper[4867]: I1006 13:07:24.383736 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" podStartSLOduration=27.383698744 podStartE2EDuration="27.383698744s" podCreationTimestamp="2025-10-06 13:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:07:24.369875342 +0000 UTC m=+223.827823486" watchObservedRunningTime="2025-10-06 13:07:24.383698744 +0000 UTC m=+223.841646928" Oct 06 13:07:24 crc kubenswrapper[4867]: I1006 13:07:24.388317 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ftjpf"] Oct 06 13:07:24 crc kubenswrapper[4867]: I1006 13:07:24.393636 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ftjpf"] Oct 06 13:07:25 crc kubenswrapper[4867]: I1006 13:07:25.229853 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa021310-c3a2-4feb-93a7-0b2eb6307147" path="/var/lib/kubelet/pods/fa021310-c3a2-4feb-93a7-0b2eb6307147/volumes" Oct 06 13:07:25 crc kubenswrapper[4867]: I1006 13:07:25.344461 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-84cc499644-gcqnj" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.277945 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzpjw"] Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.278817 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zzpjw" podUID="fa3ccdb7-011e-4f41-9599-dab85489545e" containerName="registry-server" 
containerID="cri-o://54740426ebf1651c843ead157cb78b7dc79be7d09e532997ad43b69a64da05f3" gracePeriod=30 Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.284074 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cwk8q"] Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.290397 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xj8lg"] Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.290600 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" podUID="0e1c41fc-3e0d-4048-97d2-54c54bc065e5" containerName="marketplace-operator" containerID="cri-o://fd204990e6cadcbd504a59fa94e64c39886451d8a0e74d818b22ecbde7380614" gracePeriod=30 Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.290889 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cwk8q" podUID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" containerName="registry-server" containerID="cri-o://3d29b019f77c40278a2dd97db4724425cced02bed0e00ae70e7a897fd586e5cc" gracePeriod=30 Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.331535 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv65z"] Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.331901 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wv65z" podUID="74e08994-05ef-40f2-903f-bb6450059e88" containerName="registry-server" containerID="cri-o://4555cac3fda8704edad0596b8326caef39a49ee91bd9cf31cad3d286ae7bc939" gracePeriod=30 Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.334669 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-65gkn"] Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.334876 4867 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-65gkn" podUID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" containerName="registry-server" containerID="cri-o://03205b186cca1f21bd51bff49e0ec83d0e93cbd4a0d5350cde9dd3e3ee54bdba" gracePeriod=30 Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.348893 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rdmr9"] Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.349610 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.356990 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rdmr9"] Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.472642 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/298bf2ee-baaf-4fbb-a107-d712667f246e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rdmr9\" (UID: \"298bf2ee-baaf-4fbb-a107-d712667f246e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.472693 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/298bf2ee-baaf-4fbb-a107-d712667f246e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rdmr9\" (UID: \"298bf2ee-baaf-4fbb-a107-d712667f246e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.472974 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mvtq\" (UniqueName: 
\"kubernetes.io/projected/298bf2ee-baaf-4fbb-a107-d712667f246e-kube-api-access-6mvtq\") pod \"marketplace-operator-79b997595-rdmr9\" (UID: \"298bf2ee-baaf-4fbb-a107-d712667f246e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.549504 4867 generic.go:334] "Generic (PLEG): container finished" podID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" containerID="3d29b019f77c40278a2dd97db4724425cced02bed0e00ae70e7a897fd586e5cc" exitCode=0 Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.549619 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwk8q" event={"ID":"7971c0f6-6d96-4e00-9bda-003c54ad5d53","Type":"ContainerDied","Data":"3d29b019f77c40278a2dd97db4724425cced02bed0e00ae70e7a897fd586e5cc"} Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.552795 4867 generic.go:334] "Generic (PLEG): container finished" podID="fa3ccdb7-011e-4f41-9599-dab85489545e" containerID="54740426ebf1651c843ead157cb78b7dc79be7d09e532997ad43b69a64da05f3" exitCode=0 Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.552889 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzpjw" event={"ID":"fa3ccdb7-011e-4f41-9599-dab85489545e","Type":"ContainerDied","Data":"54740426ebf1651c843ead157cb78b7dc79be7d09e532997ad43b69a64da05f3"} Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.559269 4867 generic.go:334] "Generic (PLEG): container finished" podID="0e1c41fc-3e0d-4048-97d2-54c54bc065e5" containerID="fd204990e6cadcbd504a59fa94e64c39886451d8a0e74d818b22ecbde7380614" exitCode=0 Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.559351 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" event={"ID":"0e1c41fc-3e0d-4048-97d2-54c54bc065e5","Type":"ContainerDied","Data":"fd204990e6cadcbd504a59fa94e64c39886451d8a0e74d818b22ecbde7380614"} Oct 06 
13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.563141 4867 generic.go:334] "Generic (PLEG): container finished" podID="74e08994-05ef-40f2-903f-bb6450059e88" containerID="4555cac3fda8704edad0596b8326caef39a49ee91bd9cf31cad3d286ae7bc939" exitCode=0 Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.563209 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv65z" event={"ID":"74e08994-05ef-40f2-903f-bb6450059e88","Type":"ContainerDied","Data":"4555cac3fda8704edad0596b8326caef39a49ee91bd9cf31cad3d286ae7bc939"} Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.566882 4867 generic.go:334] "Generic (PLEG): container finished" podID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" containerID="03205b186cca1f21bd51bff49e0ec83d0e93cbd4a0d5350cde9dd3e3ee54bdba" exitCode=0 Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.566918 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65gkn" event={"ID":"4d2f5af5-a716-4878-ac8b-81c7636ffd7e","Type":"ContainerDied","Data":"03205b186cca1f21bd51bff49e0ec83d0e93cbd4a0d5350cde9dd3e3ee54bdba"} Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.575125 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mvtq\" (UniqueName: \"kubernetes.io/projected/298bf2ee-baaf-4fbb-a107-d712667f246e-kube-api-access-6mvtq\") pod \"marketplace-operator-79b997595-rdmr9\" (UID: \"298bf2ee-baaf-4fbb-a107-d712667f246e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.575199 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/298bf2ee-baaf-4fbb-a107-d712667f246e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rdmr9\" (UID: \"298bf2ee-baaf-4fbb-a107-d712667f246e\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.575228 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/298bf2ee-baaf-4fbb-a107-d712667f246e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rdmr9\" (UID: \"298bf2ee-baaf-4fbb-a107-d712667f246e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.578853 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/298bf2ee-baaf-4fbb-a107-d712667f246e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rdmr9\" (UID: \"298bf2ee-baaf-4fbb-a107-d712667f246e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.585158 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/298bf2ee-baaf-4fbb-a107-d712667f246e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rdmr9\" (UID: \"298bf2ee-baaf-4fbb-a107-d712667f246e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.592784 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mvtq\" (UniqueName: \"kubernetes.io/projected/298bf2ee-baaf-4fbb-a107-d712667f246e-kube-api-access-6mvtq\") pod \"marketplace-operator-79b997595-rdmr9\" (UID: \"298bf2ee-baaf-4fbb-a107-d712667f246e\") " pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.683824 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.775502 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65gkn" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.785147 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv65z" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.786090 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.798108 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.879709 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-catalog-content\") pod \"74e08994-05ef-40f2-903f-bb6450059e88\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.879760 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k994s\" (UniqueName: \"kubernetes.io/projected/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-kube-api-access-k994s\") pod \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.879801 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq9fb\" (UniqueName: \"kubernetes.io/projected/74e08994-05ef-40f2-903f-bb6450059e88-kube-api-access-nq9fb\") pod \"74e08994-05ef-40f2-903f-bb6450059e88\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " Oct 06 
13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.879844 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-utilities\") pod \"74e08994-05ef-40f2-903f-bb6450059e88\" (UID: \"74e08994-05ef-40f2-903f-bb6450059e88\") " Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.879886 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-utilities\") pod \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.879930 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-catalog-content\") pod \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\" (UID: \"4d2f5af5-a716-4878-ac8b-81c7636ffd7e\") " Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.881091 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-utilities" (OuterVolumeSpecName: "utilities") pod "4d2f5af5-a716-4878-ac8b-81c7636ffd7e" (UID: "4d2f5af5-a716-4878-ac8b-81c7636ffd7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.881664 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-utilities" (OuterVolumeSpecName: "utilities") pod "74e08994-05ef-40f2-903f-bb6450059e88" (UID: "74e08994-05ef-40f2-903f-bb6450059e88"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.884738 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e08994-05ef-40f2-903f-bb6450059e88-kube-api-access-nq9fb" (OuterVolumeSpecName: "kube-api-access-nq9fb") pod "74e08994-05ef-40f2-903f-bb6450059e88" (UID: "74e08994-05ef-40f2-903f-bb6450059e88"). InnerVolumeSpecName "kube-api-access-nq9fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.885101 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-kube-api-access-k994s" (OuterVolumeSpecName: "kube-api-access-k994s") pod "4d2f5af5-a716-4878-ac8b-81c7636ffd7e" (UID: "4d2f5af5-a716-4878-ac8b-81c7636ffd7e"). InnerVolumeSpecName "kube-api-access-k994s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.898320 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74e08994-05ef-40f2-903f-bb6450059e88" (UID: "74e08994-05ef-40f2-903f-bb6450059e88"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.976194 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d2f5af5-a716-4878-ac8b-81c7636ffd7e" (UID: "4d2f5af5-a716-4878-ac8b-81c7636ffd7e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.980917 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-catalog-content\") pod \"fa3ccdb7-011e-4f41-9599-dab85489545e\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.981006 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn296\" (UniqueName: \"kubernetes.io/projected/fa3ccdb7-011e-4f41-9599-dab85489545e-kube-api-access-wn296\") pod \"fa3ccdb7-011e-4f41-9599-dab85489545e\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.981058 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-operator-metrics\") pod \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.981140 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-trusted-ca\") pod \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.981179 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ztc2\" (UniqueName: \"kubernetes.io/projected/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-kube-api-access-4ztc2\") pod \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\" (UID: \"0e1c41fc-3e0d-4048-97d2-54c54bc065e5\") " Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.981246 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-utilities\") pod \"fa3ccdb7-011e-4f41-9599-dab85489545e\" (UID: \"fa3ccdb7-011e-4f41-9599-dab85489545e\") " Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.981604 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq9fb\" (UniqueName: \"kubernetes.io/projected/74e08994-05ef-40f2-903f-bb6450059e88-kube-api-access-nq9fb\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.981638 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.981657 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.981676 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.981688 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e08994-05ef-40f2-903f-bb6450059e88-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.981700 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k994s\" (UniqueName: \"kubernetes.io/projected/4d2f5af5-a716-4878-ac8b-81c7636ffd7e-kube-api-access-k994s\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.982757 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-utilities" (OuterVolumeSpecName: "utilities") pod "fa3ccdb7-011e-4f41-9599-dab85489545e" (UID: "fa3ccdb7-011e-4f41-9599-dab85489545e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.982887 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0e1c41fc-3e0d-4048-97d2-54c54bc065e5" (UID: "0e1c41fc-3e0d-4048-97d2-54c54bc065e5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.984916 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3ccdb7-011e-4f41-9599-dab85489545e-kube-api-access-wn296" (OuterVolumeSpecName: "kube-api-access-wn296") pod "fa3ccdb7-011e-4f41-9599-dab85489545e" (UID: "fa3ccdb7-011e-4f41-9599-dab85489545e"). InnerVolumeSpecName "kube-api-access-wn296". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.985381 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-kube-api-access-4ztc2" (OuterVolumeSpecName: "kube-api-access-4ztc2") pod "0e1c41fc-3e0d-4048-97d2-54c54bc065e5" (UID: "0e1c41fc-3e0d-4048-97d2-54c54bc065e5"). InnerVolumeSpecName "kube-api-access-4ztc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:07:53 crc kubenswrapper[4867]: I1006 13:07:53.985669 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0e1c41fc-3e0d-4048-97d2-54c54bc065e5" (UID: "0e1c41fc-3e0d-4048-97d2-54c54bc065e5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.028517 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa3ccdb7-011e-4f41-9599-dab85489545e" (UID: "fa3ccdb7-011e-4f41-9599-dab85489545e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.085022 4867 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.085058 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ztc2\" (UniqueName: \"kubernetes.io/projected/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-kube-api-access-4ztc2\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.085067 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.085076 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fa3ccdb7-011e-4f41-9599-dab85489545e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.085086 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn296\" (UniqueName: \"kubernetes.io/projected/fa3ccdb7-011e-4f41-9599-dab85489545e-kube-api-access-wn296\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.085094 4867 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0e1c41fc-3e0d-4048-97d2-54c54bc065e5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.100901 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rdmr9"] Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.148827 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.288085 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n72zz\" (UniqueName: \"kubernetes.io/projected/7971c0f6-6d96-4e00-9bda-003c54ad5d53-kube-api-access-n72zz\") pod \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\" (UID: \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.288177 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-catalog-content\") pod \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\" (UID: \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.288343 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-utilities\") pod \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\" (UID: \"7971c0f6-6d96-4e00-9bda-003c54ad5d53\") " Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.289412 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-utilities" (OuterVolumeSpecName: "utilities") pod "7971c0f6-6d96-4e00-9bda-003c54ad5d53" (UID: "7971c0f6-6d96-4e00-9bda-003c54ad5d53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.294761 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7971c0f6-6d96-4e00-9bda-003c54ad5d53-kube-api-access-n72zz" (OuterVolumeSpecName: "kube-api-access-n72zz") pod "7971c0f6-6d96-4e00-9bda-003c54ad5d53" (UID: "7971c0f6-6d96-4e00-9bda-003c54ad5d53"). InnerVolumeSpecName "kube-api-access-n72zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.367225 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7971c0f6-6d96-4e00-9bda-003c54ad5d53" (UID: "7971c0f6-6d96-4e00-9bda-003c54ad5d53"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.390458 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n72zz\" (UniqueName: \"kubernetes.io/projected/7971c0f6-6d96-4e00-9bda-003c54ad5d53-kube-api-access-n72zz\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.390508 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.390526 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7971c0f6-6d96-4e00-9bda-003c54ad5d53-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.574477 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwk8q" event={"ID":"7971c0f6-6d96-4e00-9bda-003c54ad5d53","Type":"ContainerDied","Data":"a51e9ce9297aea581ed6f1c66fd8989d1b039b0c503d8bd877419d771321a34b"} Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.574549 4867 scope.go:117] "RemoveContainer" containerID="3d29b019f77c40278a2dd97db4724425cced02bed0e00ae70e7a897fd586e5cc" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.574747 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cwk8q" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.580173 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" event={"ID":"298bf2ee-baaf-4fbb-a107-d712667f246e","Type":"ContainerStarted","Data":"2b64ad608baa474171384b7c8e0e25ea4112f202a866511251cd41cd7f21678c"} Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.586838 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" event={"ID":"298bf2ee-baaf-4fbb-a107-d712667f246e","Type":"ContainerStarted","Data":"3336d64f365765cad12ef913630de26d91f4ee35b3621710644e2c079a73239f"} Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.587954 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.589561 4867 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rdmr9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.589629 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" podUID="298bf2ee-baaf-4fbb-a107-d712667f246e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.592445 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzpjw" 
event={"ID":"fa3ccdb7-011e-4f41-9599-dab85489545e","Type":"ContainerDied","Data":"558117691be34bc3e81d4883b9c48e77ecbf0acc8ef608cce02cb58d1ab333d2"} Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.592667 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzpjw" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.596546 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" event={"ID":"0e1c41fc-3e0d-4048-97d2-54c54bc065e5","Type":"ContainerDied","Data":"858e07a946a4fcc0475b2e70cf9b00e347fafe163324d1c0e8099989e3bad15c"} Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.596752 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xj8lg" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.602615 4867 scope.go:117] "RemoveContainer" containerID="43dc9e0893f1e942f25577ea72894058d44bc3b0f537f19ddf1d0f3ea5aaff6d" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.627086 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9" podStartSLOduration=1.62704109 podStartE2EDuration="1.62704109s" podCreationTimestamp="2025-10-06 13:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:07:54.625663882 +0000 UTC m=+254.083612066" watchObservedRunningTime="2025-10-06 13:07:54.62704109 +0000 UTC m=+254.084989234" Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.634124 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wv65z" event={"ID":"74e08994-05ef-40f2-903f-bb6450059e88","Type":"ContainerDied","Data":"f1d92ee0377ee4b3431a4880c5f3d13499881bb55ac109b1542deefe69b70cf8"} Oct 06 13:07:54 crc 
kubenswrapper[4867]: I1006 13:07:54.635229 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wv65z"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.637848 4867 scope.go:117] "RemoveContainer" containerID="12e97de2de7b500d756a2dd9d05ef2d8e05b15cbc83730fd2c10dd4c2d0fdfb0"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.646172 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65gkn" event={"ID":"4d2f5af5-a716-4878-ac8b-81c7636ffd7e","Type":"ContainerDied","Data":"abb6e6f8c24a73597c6bb46f2350cb0c82ec68e2f9d62375c5af34d4f3e94f00"}
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.646319 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65gkn"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.658106 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cwk8q"]
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.669647 4867 scope.go:117] "RemoveContainer" containerID="54740426ebf1651c843ead157cb78b7dc79be7d09e532997ad43b69a64da05f3"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.670440 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cwk8q"]
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.680615 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzpjw"]
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.683673 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zzpjw"]
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.690519 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv65z"]
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.695200 4867 scope.go:117] "RemoveContainer" containerID="b79348a59faee846058b905c5c39be6425adefa0d04dd12361115fafb3fe9c3c"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.698466 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wv65z"]
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.720299 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-65gkn"]
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.722769 4867 scope.go:117] "RemoveContainer" containerID="8c2244dbf2ffee738e3ae4d3d432d3b9b26ad407dc0fcba2b29316bf9f351f07"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.728954 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-65gkn"]
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.731432 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xj8lg"]
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.736201 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xj8lg"]
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.736647 4867 scope.go:117] "RemoveContainer" containerID="fd204990e6cadcbd504a59fa94e64c39886451d8a0e74d818b22ecbde7380614"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.751079 4867 scope.go:117] "RemoveContainer" containerID="4555cac3fda8704edad0596b8326caef39a49ee91bd9cf31cad3d286ae7bc939"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.768439 4867 scope.go:117] "RemoveContainer" containerID="7cb69bffb95c9e7f4264b716cb72839c81c5a5982427527368c5e85706aeb8ca"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.782291 4867 scope.go:117] "RemoveContainer" containerID="493d3d01e1c0cba5cf3400fa9847f94517df07b68ed8f80ea9eb328ef056d47c"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.796538 4867 scope.go:117] "RemoveContainer" containerID="03205b186cca1f21bd51bff49e0ec83d0e93cbd4a0d5350cde9dd3e3ee54bdba"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.810335 4867 scope.go:117] "RemoveContainer" containerID="6b74cf990b57afb9f1a8df3aa59c58818066ea486002c47509996dd0dd430e58"
Oct 06 13:07:54 crc kubenswrapper[4867]: I1006 13:07:54.824487 4867 scope.go:117] "RemoveContainer" containerID="b3fc6dbadd3586c49ab59c12d855fbdaa1f28f8dee0dbc4a461064f9779bb955"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.229771 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e1c41fc-3e0d-4048-97d2-54c54bc065e5" path="/var/lib/kubelet/pods/0e1c41fc-3e0d-4048-97d2-54c54bc065e5/volumes"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.230559 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" path="/var/lib/kubelet/pods/4d2f5af5-a716-4878-ac8b-81c7636ffd7e/volumes"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.231219 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e08994-05ef-40f2-903f-bb6450059e88" path="/var/lib/kubelet/pods/74e08994-05ef-40f2-903f-bb6450059e88/volumes"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.231822 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" path="/var/lib/kubelet/pods/7971c0f6-6d96-4e00-9bda-003c54ad5d53/volumes"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.232463 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa3ccdb7-011e-4f41-9599-dab85489545e" path="/var/lib/kubelet/pods/fa3ccdb7-011e-4f41-9599-dab85489545e/volumes"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289525 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-khr9q"]
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289759 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e08994-05ef-40f2-903f-bb6450059e88" containerName="extract-utilities"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289773 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e08994-05ef-40f2-903f-bb6450059e88" containerName="extract-utilities"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289787 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289795 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289802 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e08994-05ef-40f2-903f-bb6450059e88" containerName="extract-content"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289810 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e08994-05ef-40f2-903f-bb6450059e88" containerName="extract-content"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289823 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e08994-05ef-40f2-903f-bb6450059e88" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289829 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e08994-05ef-40f2-903f-bb6450059e88" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289840 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3ccdb7-011e-4f41-9599-dab85489545e" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289846 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3ccdb7-011e-4f41-9599-dab85489545e" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289856 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3ccdb7-011e-4f41-9599-dab85489545e" containerName="extract-utilities"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289863 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3ccdb7-011e-4f41-9599-dab85489545e" containerName="extract-utilities"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289872 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" containerName="extract-content"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289879 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" containerName="extract-content"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289890 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" containerName="extract-utilities"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289897 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" containerName="extract-utilities"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289907 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" containerName="extract-content"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289914 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" containerName="extract-content"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289923 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e1c41fc-3e0d-4048-97d2-54c54bc065e5" containerName="marketplace-operator"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289931 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1c41fc-3e0d-4048-97d2-54c54bc065e5" containerName="marketplace-operator"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289942 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" containerName="extract-utilities"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289950 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" containerName="extract-utilities"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289958 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289965 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: E1006 13:07:55.289982 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3ccdb7-011e-4f41-9599-dab85489545e" containerName="extract-content"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.289989 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3ccdb7-011e-4f41-9599-dab85489545e" containerName="extract-content"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.290086 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3ccdb7-011e-4f41-9599-dab85489545e" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.290101 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e1c41fc-3e0d-4048-97d2-54c54bc065e5" containerName="marketplace-operator"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.290112 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d2f5af5-a716-4878-ac8b-81c7636ffd7e" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.290127 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="7971c0f6-6d96-4e00-9bda-003c54ad5d53" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.290137 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e08994-05ef-40f2-903f-bb6450059e88" containerName="registry-server"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.290974 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khr9q"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.295616 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.300927 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-khr9q"]
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.310804 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3d6b459-d981-4cbe-8658-27860d930c81-catalog-content\") pod \"redhat-marketplace-khr9q\" (UID: \"a3d6b459-d981-4cbe-8658-27860d930c81\") " pod="openshift-marketplace/redhat-marketplace-khr9q"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.310863 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s67v2\" (UniqueName: \"kubernetes.io/projected/a3d6b459-d981-4cbe-8658-27860d930c81-kube-api-access-s67v2\") pod \"redhat-marketplace-khr9q\" (UID: \"a3d6b459-d981-4cbe-8658-27860d930c81\") " pod="openshift-marketplace/redhat-marketplace-khr9q"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.310924 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3d6b459-d981-4cbe-8658-27860d930c81-utilities\") pod \"redhat-marketplace-khr9q\" (UID: \"a3d6b459-d981-4cbe-8658-27860d930c81\") " pod="openshift-marketplace/redhat-marketplace-khr9q"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.412240 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s67v2\" (UniqueName: \"kubernetes.io/projected/a3d6b459-d981-4cbe-8658-27860d930c81-kube-api-access-s67v2\") pod \"redhat-marketplace-khr9q\" (UID: \"a3d6b459-d981-4cbe-8658-27860d930c81\") " pod="openshift-marketplace/redhat-marketplace-khr9q"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.412340 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3d6b459-d981-4cbe-8658-27860d930c81-utilities\") pod \"redhat-marketplace-khr9q\" (UID: \"a3d6b459-d981-4cbe-8658-27860d930c81\") " pod="openshift-marketplace/redhat-marketplace-khr9q"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.412385 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3d6b459-d981-4cbe-8658-27860d930c81-catalog-content\") pod \"redhat-marketplace-khr9q\" (UID: \"a3d6b459-d981-4cbe-8658-27860d930c81\") " pod="openshift-marketplace/redhat-marketplace-khr9q"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.412795 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3d6b459-d981-4cbe-8658-27860d930c81-catalog-content\") pod \"redhat-marketplace-khr9q\" (UID: \"a3d6b459-d981-4cbe-8658-27860d930c81\") " pod="openshift-marketplace/redhat-marketplace-khr9q"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.412898 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3d6b459-d981-4cbe-8658-27860d930c81-utilities\") pod \"redhat-marketplace-khr9q\" (UID: \"a3d6b459-d981-4cbe-8658-27860d930c81\") " pod="openshift-marketplace/redhat-marketplace-khr9q"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.430336 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s67v2\" (UniqueName: \"kubernetes.io/projected/a3d6b459-d981-4cbe-8658-27860d930c81-kube-api-access-s67v2\") pod \"redhat-marketplace-khr9q\" (UID: \"a3d6b459-d981-4cbe-8658-27860d930c81\") " pod="openshift-marketplace/redhat-marketplace-khr9q"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.607868 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khr9q"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.676989 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rdmr9"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.833840 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-khr9q"]
Oct 06 13:07:55 crc kubenswrapper[4867]: W1006 13:07:55.839673 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3d6b459_d981_4cbe_8658_27860d930c81.slice/crio-68abf0981883bc4d869334fded94579f45a2a2a6fccf41e28441efd81db891bb WatchSource:0}: Error finding container 68abf0981883bc4d869334fded94579f45a2a2a6fccf41e28441efd81db891bb: Status 404 returned error can't find the container with id 68abf0981883bc4d869334fded94579f45a2a2a6fccf41e28441efd81db891bb
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.890989 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6x2pl"]
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.892777 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2pl"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.896328 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.901174 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6x2pl"]
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.920579 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-catalog-content\") pod \"redhat-operators-6x2pl\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " pod="openshift-marketplace/redhat-operators-6x2pl"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.920900 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zvxl\" (UniqueName: \"kubernetes.io/projected/5d532425-fb08-45ce-81ae-4e1b31e099d3-kube-api-access-8zvxl\") pod \"redhat-operators-6x2pl\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " pod="openshift-marketplace/redhat-operators-6x2pl"
Oct 06 13:07:55 crc kubenswrapper[4867]: I1006 13:07:55.921041 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-utilities\") pod \"redhat-operators-6x2pl\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " pod="openshift-marketplace/redhat-operators-6x2pl"
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.021869 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-catalog-content\") pod \"redhat-operators-6x2pl\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " pod="openshift-marketplace/redhat-operators-6x2pl"
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.021908 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zvxl\" (UniqueName: \"kubernetes.io/projected/5d532425-fb08-45ce-81ae-4e1b31e099d3-kube-api-access-8zvxl\") pod \"redhat-operators-6x2pl\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " pod="openshift-marketplace/redhat-operators-6x2pl"
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.021947 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-utilities\") pod \"redhat-operators-6x2pl\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " pod="openshift-marketplace/redhat-operators-6x2pl"
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.022466 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-catalog-content\") pod \"redhat-operators-6x2pl\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " pod="openshift-marketplace/redhat-operators-6x2pl"
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.022634 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-utilities\") pod \"redhat-operators-6x2pl\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " pod="openshift-marketplace/redhat-operators-6x2pl"
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.041825 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvxl\" (UniqueName: \"kubernetes.io/projected/5d532425-fb08-45ce-81ae-4e1b31e099d3-kube-api-access-8zvxl\") pod \"redhat-operators-6x2pl\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " pod="openshift-marketplace/redhat-operators-6x2pl"
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.229633 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2pl"
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.403280 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6x2pl"]
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.675047 4867 generic.go:334] "Generic (PLEG): container finished" podID="a3d6b459-d981-4cbe-8658-27860d930c81" containerID="3dd68097dbbde4e1f71dfdbb57986f81d5af2777fb65e97c256728165ee254dd" exitCode=0
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.675172 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khr9q" event={"ID":"a3d6b459-d981-4cbe-8658-27860d930c81","Type":"ContainerDied","Data":"3dd68097dbbde4e1f71dfdbb57986f81d5af2777fb65e97c256728165ee254dd"}
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.675562 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khr9q" event={"ID":"a3d6b459-d981-4cbe-8658-27860d930c81","Type":"ContainerStarted","Data":"68abf0981883bc4d869334fded94579f45a2a2a6fccf41e28441efd81db891bb"}
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.677413 4867 generic.go:334] "Generic (PLEG): container finished" podID="5d532425-fb08-45ce-81ae-4e1b31e099d3" containerID="01130d36df842e709beeb466d676d11d621e2eacf0af0dbd74c6e6f8f41add12" exitCode=0
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.677532 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2pl" event={"ID":"5d532425-fb08-45ce-81ae-4e1b31e099d3","Type":"ContainerDied","Data":"01130d36df842e709beeb466d676d11d621e2eacf0af0dbd74c6e6f8f41add12"}
Oct 06 13:07:56 crc kubenswrapper[4867]: I1006 13:07:56.677566 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2pl" event={"ID":"5d532425-fb08-45ce-81ae-4e1b31e099d3","Type":"ContainerStarted","Data":"3bbda80afa0bb35469d1c37649f9a0e73c8bb2741e2b96f1c15664a1459a847e"}
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.690648 4867 generic.go:334] "Generic (PLEG): container finished" podID="a3d6b459-d981-4cbe-8658-27860d930c81" containerID="cb1016edc55a6f2184649007bfe9b4e823411f6042c89dff585c8aede13b5ea9" exitCode=0
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.691551 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khr9q" event={"ID":"a3d6b459-d981-4cbe-8658-27860d930c81","Type":"ContainerDied","Data":"cb1016edc55a6f2184649007bfe9b4e823411f6042c89dff585c8aede13b5ea9"}
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.697165 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xxfqf"]
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.700350 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxfqf"
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.706781 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.707054 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxfqf"]
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.846781 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-utilities\") pod \"certified-operators-xxfqf\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " pod="openshift-marketplace/certified-operators-xxfqf"
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.846846 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnxp9\" (UniqueName: \"kubernetes.io/projected/dab5b2af-998d-46de-8029-37bc95f7abb0-kube-api-access-bnxp9\") pod \"certified-operators-xxfqf\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " pod="openshift-marketplace/certified-operators-xxfqf"
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.846892 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-catalog-content\") pod \"certified-operators-xxfqf\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " pod="openshift-marketplace/certified-operators-xxfqf"
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.948455 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-utilities\") pod \"certified-operators-xxfqf\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " pod="openshift-marketplace/certified-operators-xxfqf"
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.948516 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnxp9\" (UniqueName: \"kubernetes.io/projected/dab5b2af-998d-46de-8029-37bc95f7abb0-kube-api-access-bnxp9\") pod \"certified-operators-xxfqf\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " pod="openshift-marketplace/certified-operators-xxfqf"
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.948557 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-catalog-content\") pod \"certified-operators-xxfqf\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " pod="openshift-marketplace/certified-operators-xxfqf"
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.948929 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-utilities\") pod \"certified-operators-xxfqf\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " pod="openshift-marketplace/certified-operators-xxfqf"
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.948975 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-catalog-content\") pod \"certified-operators-xxfqf\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " pod="openshift-marketplace/certified-operators-xxfqf"
Oct 06 13:07:57 crc kubenswrapper[4867]: I1006 13:07:57.970454 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnxp9\" (UniqueName: \"kubernetes.io/projected/dab5b2af-998d-46de-8029-37bc95f7abb0-kube-api-access-bnxp9\") pod \"certified-operators-xxfqf\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " pod="openshift-marketplace/certified-operators-xxfqf"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.078945 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxfqf"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.288724 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-29nnd"]
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.290162 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-29nnd"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.293167 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.297226 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29nnd"]
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.455836 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r6jr\" (UniqueName: \"kubernetes.io/projected/977ff4f6-ca6e-4ba8-8db3-3eb828510a13-kube-api-access-5r6jr\") pod \"community-operators-29nnd\" (UID: \"977ff4f6-ca6e-4ba8-8db3-3eb828510a13\") " pod="openshift-marketplace/community-operators-29nnd"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.456577 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/977ff4f6-ca6e-4ba8-8db3-3eb828510a13-utilities\") pod \"community-operators-29nnd\" (UID: \"977ff4f6-ca6e-4ba8-8db3-3eb828510a13\") " pod="openshift-marketplace/community-operators-29nnd"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.456759 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/977ff4f6-ca6e-4ba8-8db3-3eb828510a13-catalog-content\") pod \"community-operators-29nnd\" (UID: \"977ff4f6-ca6e-4ba8-8db3-3eb828510a13\") " pod="openshift-marketplace/community-operators-29nnd"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.484636 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxfqf"]
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.558141 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r6jr\" (UniqueName: \"kubernetes.io/projected/977ff4f6-ca6e-4ba8-8db3-3eb828510a13-kube-api-access-5r6jr\") pod \"community-operators-29nnd\" (UID: \"977ff4f6-ca6e-4ba8-8db3-3eb828510a13\") " pod="openshift-marketplace/community-operators-29nnd"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.558195 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/977ff4f6-ca6e-4ba8-8db3-3eb828510a13-utilities\") pod \"community-operators-29nnd\" (UID: \"977ff4f6-ca6e-4ba8-8db3-3eb828510a13\") " pod="openshift-marketplace/community-operators-29nnd"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.558222 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/977ff4f6-ca6e-4ba8-8db3-3eb828510a13-catalog-content\") pod \"community-operators-29nnd\" (UID: \"977ff4f6-ca6e-4ba8-8db3-3eb828510a13\") " pod="openshift-marketplace/community-operators-29nnd"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.558755 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/977ff4f6-ca6e-4ba8-8db3-3eb828510a13-utilities\") pod \"community-operators-29nnd\" (UID: \"977ff4f6-ca6e-4ba8-8db3-3eb828510a13\") " pod="openshift-marketplace/community-operators-29nnd"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.558913 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/977ff4f6-ca6e-4ba8-8db3-3eb828510a13-catalog-content\") pod \"community-operators-29nnd\" (UID: \"977ff4f6-ca6e-4ba8-8db3-3eb828510a13\") " pod="openshift-marketplace/community-operators-29nnd"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.578280 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r6jr\" (UniqueName: \"kubernetes.io/projected/977ff4f6-ca6e-4ba8-8db3-3eb828510a13-kube-api-access-5r6jr\") pod \"community-operators-29nnd\" (UID: \"977ff4f6-ca6e-4ba8-8db3-3eb828510a13\") " pod="openshift-marketplace/community-operators-29nnd"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.665583 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-29nnd"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.702380 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfqf" event={"ID":"dab5b2af-998d-46de-8029-37bc95f7abb0","Type":"ContainerStarted","Data":"3a1ec7c804860a67684024edd77a89e5c13d1800009e337fb9648e3f65fae010"}
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.702445 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfqf" event={"ID":"dab5b2af-998d-46de-8029-37bc95f7abb0","Type":"ContainerStarted","Data":"7325c242b5851f88b47c1304abc829a114dc829522d195189c3d96c346c0aead"}
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.716354 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khr9q" event={"ID":"a3d6b459-d981-4cbe-8658-27860d930c81","Type":"ContainerStarted","Data":"37c1ea05481106775a492e8949277185ae4a7b78301e54d8ac2ca449a42ab964"}
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.728351 4867 generic.go:334] "Generic (PLEG): container finished" podID="5d532425-fb08-45ce-81ae-4e1b31e099d3" containerID="ebe75cfb3522ae85dd13c13d4ab1c46c7c1cca6a1440e453e78ac42aa8e3d421" exitCode=0
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.728409 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2pl" event={"ID":"5d532425-fb08-45ce-81ae-4e1b31e099d3","Type":"ContainerDied","Data":"ebe75cfb3522ae85dd13c13d4ab1c46c7c1cca6a1440e453e78ac42aa8e3d421"}
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.738128 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-khr9q" podStartSLOduration=2.218638548 podStartE2EDuration="3.738109127s" podCreationTimestamp="2025-10-06 13:07:55 +0000 UTC" firstStartedPulling="2025-10-06 13:07:56.676930018 +0000 UTC m=+256.134878162" lastFinishedPulling="2025-10-06 13:07:58.196400597 +0000 UTC m=+257.654348741" observedRunningTime="2025-10-06 13:07:58.737245213 +0000 UTC m=+258.195193377" watchObservedRunningTime="2025-10-06 13:07:58.738109127 +0000 UTC m=+258.196057261"
Oct 06 13:07:58 crc kubenswrapper[4867]: I1006 13:07:58.862862 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29nnd"]
Oct 06 13:07:58 crc kubenswrapper[4867]: W1006 13:07:58.874313 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod977ff4f6_ca6e_4ba8_8db3_3eb828510a13.slice/crio-5feb827d808f31d0580c435d679d52252ed171d7be73ec61bdc9957d83b6d176 WatchSource:0}: Error finding container 5feb827d808f31d0580c435d679d52252ed171d7be73ec61bdc9957d83b6d176: Status 404 returned error can't find the container with id 5feb827d808f31d0580c435d679d52252ed171d7be73ec61bdc9957d83b6d176
Oct 06 13:07:59 crc kubenswrapper[4867]: I1006 13:07:59.734515 4867 generic.go:334] "Generic (PLEG): container finished" podID="977ff4f6-ca6e-4ba8-8db3-3eb828510a13" containerID="becaf8cb187e479ead44590617d6dd9134130a9a02719c892dfbfb972be4aa78" exitCode=0
Oct 06 13:07:59 crc kubenswrapper[4867]: I1006 13:07:59.734761 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29nnd" event={"ID":"977ff4f6-ca6e-4ba8-8db3-3eb828510a13","Type":"ContainerDied","Data":"becaf8cb187e479ead44590617d6dd9134130a9a02719c892dfbfb972be4aa78"}
Oct 06 13:07:59 crc kubenswrapper[4867]: I1006 13:07:59.735861 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29nnd" event={"ID":"977ff4f6-ca6e-4ba8-8db3-3eb828510a13","Type":"ContainerStarted","Data":"5feb827d808f31d0580c435d679d52252ed171d7be73ec61bdc9957d83b6d176"}
Oct 06 13:07:59 crc kubenswrapper[4867]: I1006 13:07:59.739936 4867 generic.go:334] "Generic (PLEG): container finished" podID="dab5b2af-998d-46de-8029-37bc95f7abb0" containerID="3a1ec7c804860a67684024edd77a89e5c13d1800009e337fb9648e3f65fae010" exitCode=0
Oct 06 13:07:59 crc kubenswrapper[4867]: I1006 13:07:59.740050 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfqf" event={"ID":"dab5b2af-998d-46de-8029-37bc95f7abb0","Type":"ContainerDied","Data":"3a1ec7c804860a67684024edd77a89e5c13d1800009e337fb9648e3f65fae010"}
Oct 06 13:07:59 crc kubenswrapper[4867]: I1006 13:07:59.740134 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfqf" event={"ID":"dab5b2af-998d-46de-8029-37bc95f7abb0","Type":"ContainerStarted","Data":"7862d997fa8da0f8166b2a77cacd05f1f50dbf6ce63755c3369827df36882f22"}
Oct 06 13:07:59 crc kubenswrapper[4867]: I1006 13:07:59.750674 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2pl" event={"ID":"5d532425-fb08-45ce-81ae-4e1b31e099d3","Type":"ContainerStarted","Data":"87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed"}
Oct 06 13:07:59 crc kubenswrapper[4867]: I1006 13:07:59.781348 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6x2pl" podStartSLOduration=2.022389004 podStartE2EDuration="4.781330726s" podCreationTimestamp="2025-10-06 13:07:55 +0000 UTC" firstStartedPulling="2025-10-06 13:07:56.678999125 +0000 UTC m=+256.136947269" lastFinishedPulling="2025-10-06 13:07:59.437940847 +0000 UTC m=+258.895888991" observedRunningTime="2025-10-06 13:07:59.778905379 +0000 UTC m=+259.236853523" watchObservedRunningTime="2025-10-06 13:07:59.781330726 +0000 UTC m=+259.239278870"
Oct 06 13:08:00 crc kubenswrapper[4867]: I1006 13:08:00.757382 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29nnd" event={"ID":"977ff4f6-ca6e-4ba8-8db3-3eb828510a13","Type":"ContainerStarted","Data":"37ed43b040f8447063c326d466267b9a0e4dd6afcd3e7131c5356d537a29ff20"}
Oct 06 13:08:00 crc kubenswrapper[4867]: I1006 13:08:00.761844 4867 generic.go:334] "Generic (PLEG): container finished" podID="dab5b2af-998d-46de-8029-37bc95f7abb0" containerID="7862d997fa8da0f8166b2a77cacd05f1f50dbf6ce63755c3369827df36882f22" exitCode=0
Oct 06 13:08:00 crc kubenswrapper[4867]: I1006 13:08:00.761927 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfqf" event={"ID":"dab5b2af-998d-46de-8029-37bc95f7abb0","Type":"ContainerDied","Data":"7862d997fa8da0f8166b2a77cacd05f1f50dbf6ce63755c3369827df36882f22"}
Oct 06 13:08:01 crc kubenswrapper[4867]: I1006 13:08:01.770084 4867 generic.go:334] "Generic (PLEG): container finished" podID="977ff4f6-ca6e-4ba8-8db3-3eb828510a13" containerID="37ed43b040f8447063c326d466267b9a0e4dd6afcd3e7131c5356d537a29ff20" exitCode=0
Oct 06
13:08:01 crc kubenswrapper[4867]: I1006 13:08:01.770170 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29nnd" event={"ID":"977ff4f6-ca6e-4ba8-8db3-3eb828510a13","Type":"ContainerDied","Data":"37ed43b040f8447063c326d466267b9a0e4dd6afcd3e7131c5356d537a29ff20"} Oct 06 13:08:02 crc kubenswrapper[4867]: I1006 13:08:02.779231 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfqf" event={"ID":"dab5b2af-998d-46de-8029-37bc95f7abb0","Type":"ContainerStarted","Data":"1e0852cdb34741a73225e3e0df387537eed8688f70d84ffc44353bd7ef523c91"} Oct 06 13:08:02 crc kubenswrapper[4867]: I1006 13:08:02.782568 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29nnd" event={"ID":"977ff4f6-ca6e-4ba8-8db3-3eb828510a13","Type":"ContainerStarted","Data":"31b64beb6725b22886afc908423c00dc0fac70cefd3b0c3ace1815968082affb"} Oct 06 13:08:02 crc kubenswrapper[4867]: I1006 13:08:02.804291 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xxfqf" podStartSLOduration=3.207254926 podStartE2EDuration="5.804268663s" podCreationTimestamp="2025-10-06 13:07:57 +0000 UTC" firstStartedPulling="2025-10-06 13:07:58.706998458 +0000 UTC m=+258.164946612" lastFinishedPulling="2025-10-06 13:08:01.304012205 +0000 UTC m=+260.761960349" observedRunningTime="2025-10-06 13:08:02.801731403 +0000 UTC m=+262.259679567" watchObservedRunningTime="2025-10-06 13:08:02.804268663 +0000 UTC m=+262.262216807" Oct 06 13:08:02 crc kubenswrapper[4867]: I1006 13:08:02.824849 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-29nnd" podStartSLOduration=3.31330551 podStartE2EDuration="4.824826601s" podCreationTimestamp="2025-10-06 13:07:58 +0000 UTC" firstStartedPulling="2025-10-06 13:07:59.73659914 +0000 UTC m=+259.194547284" 
lastFinishedPulling="2025-10-06 13:08:01.248120241 +0000 UTC m=+260.706068375" observedRunningTime="2025-10-06 13:08:02.822911209 +0000 UTC m=+262.280859363" watchObservedRunningTime="2025-10-06 13:08:02.824826601 +0000 UTC m=+262.282774755" Oct 06 13:08:05 crc kubenswrapper[4867]: I1006 13:08:05.609001 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-khr9q" Oct 06 13:08:05 crc kubenswrapper[4867]: I1006 13:08:05.610927 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-khr9q" Oct 06 13:08:05 crc kubenswrapper[4867]: I1006 13:08:05.672519 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-khr9q" Oct 06 13:08:05 crc kubenswrapper[4867]: I1006 13:08:05.840635 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-khr9q" Oct 06 13:08:06 crc kubenswrapper[4867]: I1006 13:08:06.230644 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6x2pl" Oct 06 13:08:06 crc kubenswrapper[4867]: I1006 13:08:06.231136 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6x2pl" Oct 06 13:08:06 crc kubenswrapper[4867]: I1006 13:08:06.270400 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6x2pl" Oct 06 13:08:06 crc kubenswrapper[4867]: I1006 13:08:06.853455 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6x2pl" Oct 06 13:08:08 crc kubenswrapper[4867]: I1006 13:08:08.079919 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xxfqf" Oct 06 13:08:08 crc kubenswrapper[4867]: I1006 
13:08:08.080244 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xxfqf" Oct 06 13:08:08 crc kubenswrapper[4867]: I1006 13:08:08.119344 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xxfqf" Oct 06 13:08:08 crc kubenswrapper[4867]: I1006 13:08:08.667042 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-29nnd" Oct 06 13:08:08 crc kubenswrapper[4867]: I1006 13:08:08.667281 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-29nnd" Oct 06 13:08:08 crc kubenswrapper[4867]: I1006 13:08:08.703411 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-29nnd" Oct 06 13:08:08 crc kubenswrapper[4867]: I1006 13:08:08.853419 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xxfqf" Oct 06 13:08:08 crc kubenswrapper[4867]: I1006 13:08:08.880439 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-29nnd" Oct 06 13:09:42 crc kubenswrapper[4867]: I1006 13:09:42.873939 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:09:42 crc kubenswrapper[4867]: I1006 13:09:42.875081 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Oct 06 13:10:12 crc kubenswrapper[4867]: I1006 13:10:12.874281 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:10:12 crc kubenswrapper[4867]: I1006 13:10:12.874864 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:10:42 crc kubenswrapper[4867]: I1006 13:10:42.873160 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:10:42 crc kubenswrapper[4867]: I1006 13:10:42.873787 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:10:42 crc kubenswrapper[4867]: I1006 13:10:42.873839 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:10:42 crc kubenswrapper[4867]: I1006 13:10:42.874542 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"7ebc66b3368265481d7ce17a498c1898a6fa0d78101df6ffd71b9d951872175a"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:10:42 crc kubenswrapper[4867]: I1006 13:10:42.875100 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://7ebc66b3368265481d7ce17a498c1898a6fa0d78101df6ffd71b9d951872175a" gracePeriod=600 Oct 06 13:10:43 crc kubenswrapper[4867]: I1006 13:10:43.734298 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="7ebc66b3368265481d7ce17a498c1898a6fa0d78101df6ffd71b9d951872175a" exitCode=0 Oct 06 13:10:43 crc kubenswrapper[4867]: I1006 13:10:43.734376 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"7ebc66b3368265481d7ce17a498c1898a6fa0d78101df6ffd71b9d951872175a"} Oct 06 13:10:43 crc kubenswrapper[4867]: I1006 13:10:43.734658 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"9e6c0a282c79916c755a5288bb4eb5014e330cf8dede8b87679a8dc2b50be474"} Oct 06 13:10:43 crc kubenswrapper[4867]: I1006 13:10:43.734685 4867 scope.go:117] "RemoveContainer" containerID="c743f36a8ceef5d3d5a92230f81d9782b2bb90654508a88bc8305f83f6631ab5" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.004128 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6kpt2"] Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 
13:11:33.005361 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.098318 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6kpt2"] Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.176746 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.176817 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-registry-tls\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.176878 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.176941 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d77wk\" (UniqueName: \"kubernetes.io/projected/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-kube-api-access-d77wk\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: 
\"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.176972 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-bound-sa-token\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.176997 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-registry-certificates\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.177024 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.177086 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-trusted-ca\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.197444 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.278232 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-trusted-ca\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.278309 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.278342 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-registry-tls\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.278402 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d77wk\" (UniqueName: \"kubernetes.io/projected/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-kube-api-access-d77wk\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: 
I1006 13:11:33.278430 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-bound-sa-token\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.278452 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-registry-certificates\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.278474 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.278846 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.280157 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-registry-certificates\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.280216 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-trusted-ca\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.284694 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.287767 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-registry-tls\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.300860 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-bound-sa-token\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: \"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.300974 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d77wk\" (UniqueName: \"kubernetes.io/projected/6b4742b2-28db-4cf4-8d96-77fb04b8c5c1-kube-api-access-d77wk\") pod \"image-registry-66df7c8f76-6kpt2\" (UID: 
\"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1\") " pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.323779 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:33 crc kubenswrapper[4867]: I1006 13:11:33.508760 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6kpt2"] Oct 06 13:11:34 crc kubenswrapper[4867]: I1006 13:11:34.009045 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" event={"ID":"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1","Type":"ContainerStarted","Data":"74cfd0b29692c8289b2921ebaf31373283b4bc314b8ae6d4d919eaf80aecaed4"} Oct 06 13:11:34 crc kubenswrapper[4867]: I1006 13:11:34.009387 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:34 crc kubenswrapper[4867]: I1006 13:11:34.009399 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" event={"ID":"6b4742b2-28db-4cf4-8d96-77fb04b8c5c1","Type":"ContainerStarted","Data":"46089655703c70785252ba54e7f637c7c7ef2cad0ac8776c7d6aec8393a19401"} Oct 06 13:11:53 crc kubenswrapper[4867]: I1006 13:11:53.327957 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" Oct 06 13:11:53 crc kubenswrapper[4867]: I1006 13:11:53.352787 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-6kpt2" podStartSLOduration=21.352752087 podStartE2EDuration="21.352752087s" podCreationTimestamp="2025-10-06 13:11:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-06 13:11:34.027633527 +0000 UTC m=+473.485581691" watchObservedRunningTime="2025-10-06 13:11:53.352752087 +0000 UTC m=+492.810700271" Oct 06 13:11:53 crc kubenswrapper[4867]: I1006 13:11:53.387744 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h9ff"] Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.429772 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" podUID="f59db107-9767-4161-83f0-09f15ba1d881" containerName="registry" containerID="cri-o://381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17" gracePeriod=30 Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.789427 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.963754 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f59db107-9767-4161-83f0-09f15ba1d881\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.963801 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkntm\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-kube-api-access-qkntm\") pod \"f59db107-9767-4161-83f0-09f15ba1d881\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.963832 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f59db107-9767-4161-83f0-09f15ba1d881-ca-trust-extracted\") pod \"f59db107-9767-4161-83f0-09f15ba1d881\" (UID: 
\"f59db107-9767-4161-83f0-09f15ba1d881\") " Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.963860 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-registry-tls\") pod \"f59db107-9767-4161-83f0-09f15ba1d881\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.963881 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-trusted-ca\") pod \"f59db107-9767-4161-83f0-09f15ba1d881\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.963905 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-registry-certificates\") pod \"f59db107-9767-4161-83f0-09f15ba1d881\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.963930 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-bound-sa-token\") pod \"f59db107-9767-4161-83f0-09f15ba1d881\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.963958 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f59db107-9767-4161-83f0-09f15ba1d881-installation-pull-secrets\") pod \"f59db107-9767-4161-83f0-09f15ba1d881\" (UID: \"f59db107-9767-4161-83f0-09f15ba1d881\") " Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.964892 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f59db107-9767-4161-83f0-09f15ba1d881" (UID: "f59db107-9767-4161-83f0-09f15ba1d881"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.964931 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f59db107-9767-4161-83f0-09f15ba1d881" (UID: "f59db107-9767-4161-83f0-09f15ba1d881"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.970118 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-kube-api-access-qkntm" (OuterVolumeSpecName: "kube-api-access-qkntm") pod "f59db107-9767-4161-83f0-09f15ba1d881" (UID: "f59db107-9767-4161-83f0-09f15ba1d881"). InnerVolumeSpecName "kube-api-access-qkntm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.970631 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f59db107-9767-4161-83f0-09f15ba1d881" (UID: "f59db107-9767-4161-83f0-09f15ba1d881"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.971373 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f59db107-9767-4161-83f0-09f15ba1d881-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f59db107-9767-4161-83f0-09f15ba1d881" (UID: "f59db107-9767-4161-83f0-09f15ba1d881"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.971863 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f59db107-9767-4161-83f0-09f15ba1d881" (UID: "f59db107-9767-4161-83f0-09f15ba1d881"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.985529 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f59db107-9767-4161-83f0-09f15ba1d881-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f59db107-9767-4161-83f0-09f15ba1d881" (UID: "f59db107-9767-4161-83f0-09f15ba1d881"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:12:18 crc kubenswrapper[4867]: I1006 13:12:18.990143 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f59db107-9767-4161-83f0-09f15ba1d881" (UID: "f59db107-9767-4161-83f0-09f15ba1d881"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.065186 4867 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.065234 4867 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.065244 4867 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f59db107-9767-4161-83f0-09f15ba1d881-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.065290 4867 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.065299 4867 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f59db107-9767-4161-83f0-09f15ba1d881-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.065308 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkntm\" (UniqueName: \"kubernetes.io/projected/f59db107-9767-4161-83f0-09f15ba1d881-kube-api-access-qkntm\") on node \"crc\" DevicePath \"\"" Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.065316 4867 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f59db107-9767-4161-83f0-09f15ba1d881-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 13:12:19 crc 
kubenswrapper[4867]: I1006 13:12:19.279376 4867 generic.go:334] "Generic (PLEG): container finished" podID="f59db107-9767-4161-83f0-09f15ba1d881" containerID="381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17" exitCode=0 Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.279490 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.279494 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" event={"ID":"f59db107-9767-4161-83f0-09f15ba1d881","Type":"ContainerDied","Data":"381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17"} Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.279677 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6h9ff" event={"ID":"f59db107-9767-4161-83f0-09f15ba1d881","Type":"ContainerDied","Data":"3f315cf3fa3fd694b6531a4cd862624df2949c49733127b4632af12d4e577d9c"} Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.279730 4867 scope.go:117] "RemoveContainer" containerID="381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17" Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.298488 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h9ff"] Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.302502 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6h9ff"] Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.307149 4867 scope.go:117] "RemoveContainer" containerID="381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17" Oct 06 13:12:19 crc kubenswrapper[4867]: E1006 13:12:19.307832 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17\": container with ID starting with 381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17 not found: ID does not exist" containerID="381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17" Oct 06 13:12:19 crc kubenswrapper[4867]: I1006 13:12:19.307917 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17"} err="failed to get container status \"381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17\": rpc error: code = NotFound desc = could not find container \"381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17\": container with ID starting with 381c14d5ecff074040321b307677fa94bc9902b1f3f3a74f731be0c62b457b17 not found: ID does not exist" Oct 06 13:12:21 crc kubenswrapper[4867]: I1006 13:12:21.238787 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f59db107-9767-4161-83f0-09f15ba1d881" path="/var/lib/kubelet/pods/f59db107-9767-4161-83f0-09f15ba1d881/volumes" Oct 06 13:12:41 crc kubenswrapper[4867]: I1006 13:12:41.356012 4867 scope.go:117] "RemoveContainer" containerID="eff23dc662105c5333071fa89927258b06bbb2f3d97487fef7d0dbec738a61c9" Oct 06 13:13:12 crc kubenswrapper[4867]: I1006 13:13:12.874102 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:13:12 crc kubenswrapper[4867]: I1006 13:13:12.875069 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.550729 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-h95bg"] Oct 06 13:13:17 crc kubenswrapper[4867]: E1006 13:13:17.551365 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f59db107-9767-4161-83f0-09f15ba1d881" containerName="registry" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.551380 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59db107-9767-4161-83f0-09f15ba1d881" containerName="registry" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.551493 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f59db107-9767-4161-83f0-09f15ba1d881" containerName="registry" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.551913 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-h95bg" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.553475 4867 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-5vwxs" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.554181 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.554483 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.554478 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-tlbns"] Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.555323 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-tlbns" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.560471 4867 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-btzfm" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.562021 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-h95bg"] Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.596233 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-tlbns"] Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.611768 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-rxv42"] Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.612657 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-rxv42" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.616162 4867 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-dnt5j" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.621881 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-rxv42"] Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.720212 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sslq\" (UniqueName: \"kubernetes.io/projected/ed16aef2-69b2-443d-8d5d-c2122dd5b373-kube-api-access-7sslq\") pod \"cert-manager-5b446d88c5-tlbns\" (UID: \"ed16aef2-69b2-443d-8d5d-c2122dd5b373\") " pod="cert-manager/cert-manager-5b446d88c5-tlbns" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.720603 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcpt7\" (UniqueName: 
\"kubernetes.io/projected/e5b18647-65b8-4ed4-bf88-542c6c583588-kube-api-access-tcpt7\") pod \"cert-manager-webhook-5655c58dd6-rxv42\" (UID: \"e5b18647-65b8-4ed4-bf88-542c6c583588\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-rxv42" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.720745 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm6wf\" (UniqueName: \"kubernetes.io/projected/29b819dc-d3f7-449d-812a-9a76c1d02046-kube-api-access-dm6wf\") pod \"cert-manager-cainjector-7f985d654d-h95bg\" (UID: \"29b819dc-d3f7-449d-812a-9a76c1d02046\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-h95bg" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.821445 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sslq\" (UniqueName: \"kubernetes.io/projected/ed16aef2-69b2-443d-8d5d-c2122dd5b373-kube-api-access-7sslq\") pod \"cert-manager-5b446d88c5-tlbns\" (UID: \"ed16aef2-69b2-443d-8d5d-c2122dd5b373\") " pod="cert-manager/cert-manager-5b446d88c5-tlbns" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.821517 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcpt7\" (UniqueName: \"kubernetes.io/projected/e5b18647-65b8-4ed4-bf88-542c6c583588-kube-api-access-tcpt7\") pod \"cert-manager-webhook-5655c58dd6-rxv42\" (UID: \"e5b18647-65b8-4ed4-bf88-542c6c583588\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-rxv42" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.821579 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm6wf\" (UniqueName: \"kubernetes.io/projected/29b819dc-d3f7-449d-812a-9a76c1d02046-kube-api-access-dm6wf\") pod \"cert-manager-cainjector-7f985d654d-h95bg\" (UID: \"29b819dc-d3f7-449d-812a-9a76c1d02046\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-h95bg" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 
13:13:17.839455 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm6wf\" (UniqueName: \"kubernetes.io/projected/29b819dc-d3f7-449d-812a-9a76c1d02046-kube-api-access-dm6wf\") pod \"cert-manager-cainjector-7f985d654d-h95bg\" (UID: \"29b819dc-d3f7-449d-812a-9a76c1d02046\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-h95bg" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.839565 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcpt7\" (UniqueName: \"kubernetes.io/projected/e5b18647-65b8-4ed4-bf88-542c6c583588-kube-api-access-tcpt7\") pod \"cert-manager-webhook-5655c58dd6-rxv42\" (UID: \"e5b18647-65b8-4ed4-bf88-542c6c583588\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-rxv42" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.852281 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sslq\" (UniqueName: \"kubernetes.io/projected/ed16aef2-69b2-443d-8d5d-c2122dd5b373-kube-api-access-7sslq\") pod \"cert-manager-5b446d88c5-tlbns\" (UID: \"ed16aef2-69b2-443d-8d5d-c2122dd5b373\") " pod="cert-manager/cert-manager-5b446d88c5-tlbns" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.872594 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-h95bg" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.880385 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-tlbns" Oct 06 13:13:17 crc kubenswrapper[4867]: I1006 13:13:17.927096 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-rxv42" Oct 06 13:13:18 crc kubenswrapper[4867]: I1006 13:13:18.235197 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-rxv42"] Oct 06 13:13:18 crc kubenswrapper[4867]: W1006 13:13:18.244046 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5b18647_65b8_4ed4_bf88_542c6c583588.slice/crio-4b999ffddb9c5c3b2fe644587d530b21abeb6b32ded13d588661810aa43769e8 WatchSource:0}: Error finding container 4b999ffddb9c5c3b2fe644587d530b21abeb6b32ded13d588661810aa43769e8: Status 404 returned error can't find the container with id 4b999ffddb9c5c3b2fe644587d530b21abeb6b32ded13d588661810aa43769e8 Oct 06 13:13:18 crc kubenswrapper[4867]: I1006 13:13:18.247132 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:13:18 crc kubenswrapper[4867]: I1006 13:13:18.358959 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-tlbns"] Oct 06 13:13:18 crc kubenswrapper[4867]: W1006 13:13:18.361411 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded16aef2_69b2_443d_8d5d_c2122dd5b373.slice/crio-3576e72e0acfe5ed3978e6f36496a1f17339d783195ee3b2e8beb0edf64c103b WatchSource:0}: Error finding container 3576e72e0acfe5ed3978e6f36496a1f17339d783195ee3b2e8beb0edf64c103b: Status 404 returned error can't find the container with id 3576e72e0acfe5ed3978e6f36496a1f17339d783195ee3b2e8beb0edf64c103b Oct 06 13:13:18 crc kubenswrapper[4867]: I1006 13:13:18.369304 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-h95bg"] Oct 06 13:13:18 crc kubenswrapper[4867]: I1006 13:13:18.664126 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-7f985d654d-h95bg" event={"ID":"29b819dc-d3f7-449d-812a-9a76c1d02046","Type":"ContainerStarted","Data":"b0d19300bd054acb947afb9504ea0704d0c6e357329e9ba73e152a335bfe879f"} Oct 06 13:13:18 crc kubenswrapper[4867]: I1006 13:13:18.665187 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-rxv42" event={"ID":"e5b18647-65b8-4ed4-bf88-542c6c583588","Type":"ContainerStarted","Data":"4b999ffddb9c5c3b2fe644587d530b21abeb6b32ded13d588661810aa43769e8"} Oct 06 13:13:18 crc kubenswrapper[4867]: I1006 13:13:18.666406 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-tlbns" event={"ID":"ed16aef2-69b2-443d-8d5d-c2122dd5b373","Type":"ContainerStarted","Data":"3576e72e0acfe5ed3978e6f36496a1f17339d783195ee3b2e8beb0edf64c103b"} Oct 06 13:13:20 crc kubenswrapper[4867]: I1006 13:13:20.682828 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-rxv42" event={"ID":"e5b18647-65b8-4ed4-bf88-542c6c583588","Type":"ContainerStarted","Data":"99ce5e5f218ed61f16b3f7dc4f273ea10adf358e99b1690ee1d6c94c9c9d068e"} Oct 06 13:13:20 crc kubenswrapper[4867]: I1006 13:13:20.684569 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-rxv42" Oct 06 13:13:20 crc kubenswrapper[4867]: I1006 13:13:20.707169 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-rxv42" podStartSLOduration=1.756573926 podStartE2EDuration="3.707142152s" podCreationTimestamp="2025-10-06 13:13:17 +0000 UTC" firstStartedPulling="2025-10-06 13:13:18.246850413 +0000 UTC m=+577.704798567" lastFinishedPulling="2025-10-06 13:13:20.197418619 +0000 UTC m=+579.655366793" observedRunningTime="2025-10-06 13:13:20.700805969 +0000 UTC m=+580.158754123" watchObservedRunningTime="2025-10-06 13:13:20.707142152 +0000 UTC 
m=+580.165090296" Oct 06 13:13:22 crc kubenswrapper[4867]: I1006 13:13:22.698745 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-tlbns" event={"ID":"ed16aef2-69b2-443d-8d5d-c2122dd5b373","Type":"ContainerStarted","Data":"c8f4a6679bcf6d619bdeef3cfbfec6be6c663435f9a9f34946e5441df60389c0"} Oct 06 13:13:22 crc kubenswrapper[4867]: I1006 13:13:22.700337 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-h95bg" event={"ID":"29b819dc-d3f7-449d-812a-9a76c1d02046","Type":"ContainerStarted","Data":"f5c644b8c231153bb8e49ca676b6fabfd8ca4cbc8bef7e82aebb125fb9144bf7"} Oct 06 13:13:22 crc kubenswrapper[4867]: I1006 13:13:22.718189 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-tlbns" podStartSLOduration=2.299257574 podStartE2EDuration="5.718166363s" podCreationTimestamp="2025-10-06 13:13:17 +0000 UTC" firstStartedPulling="2025-10-06 13:13:18.364632086 +0000 UTC m=+577.822580240" lastFinishedPulling="2025-10-06 13:13:21.783540885 +0000 UTC m=+581.241489029" observedRunningTime="2025-10-06 13:13:22.712578511 +0000 UTC m=+582.170526665" watchObservedRunningTime="2025-10-06 13:13:22.718166363 +0000 UTC m=+582.176114507" Oct 06 13:13:27 crc kubenswrapper[4867]: I1006 13:13:27.932419 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-rxv42" Oct 06 13:13:27 crc kubenswrapper[4867]: I1006 13:13:27.959483 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-h95bg" podStartSLOduration=7.616422666 podStartE2EDuration="10.959459632s" podCreationTimestamp="2025-10-06 13:13:17 +0000 UTC" firstStartedPulling="2025-10-06 13:13:18.375558933 +0000 UTC m=+577.833507077" lastFinishedPulling="2025-10-06 13:13:21.718595899 +0000 UTC m=+581.176544043" observedRunningTime="2025-10-06 
13:13:22.73056775 +0000 UTC m=+582.188515894" watchObservedRunningTime="2025-10-06 13:13:27.959459632 +0000 UTC m=+587.417407776" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.242106 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zlc7z"] Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.242901 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovn-controller" containerID="cri-o://e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315" gracePeriod=30 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.242941 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="nbdb" containerID="cri-o://ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d" gracePeriod=30 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.243001 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="kube-rbac-proxy-node" containerID="cri-o://0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7" gracePeriod=30 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.243018 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7" gracePeriod=30 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.243122 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" 
podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovn-acl-logging" containerID="cri-o://57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8" gracePeriod=30 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.243073 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="northd" containerID="cri-o://848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3" gracePeriod=30 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.243177 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="sbdb" containerID="cri-o://7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9" gracePeriod=30 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.278597 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" containerID="cri-o://d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69" gracePeriod=30 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.634436 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/3.log" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.637470 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovn-acl-logging/0.log" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.638235 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovn-controller/0.log" Oct 06 13:13:28 crc 
kubenswrapper[4867]: I1006 13:13:28.638856 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713526 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x5qw7"] Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.713761 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovn-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713775 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovn-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.713788 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713794 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.713804 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="sbdb" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713811 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="sbdb" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.713819 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713825 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.713833 4867 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovn-acl-logging" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713839 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovn-acl-logging" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.713847 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="northd" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713852 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="northd" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.713863 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="kube-rbac-proxy-node" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713871 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="kube-rbac-proxy-node" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.713882 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="kubecfg-setup" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713890 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="kubecfg-setup" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.713899 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713906 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.713913 4867 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713920 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.713930 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="nbdb" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.713937 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="nbdb" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714019 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="sbdb" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714029 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714035 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714044 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="northd" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714051 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="kube-rbac-proxy-node" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714061 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovn-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714068 4867 
memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714074 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714080 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovn-acl-logging" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714087 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="nbdb" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.714174 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714203 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.714219 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714226 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714327 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.714494 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" containerName="ovnkube-controller" Oct 06 13:13:28 crc 
kubenswrapper[4867]: I1006 13:13:28.716996 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.749932 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-knnfm_8e3bebeb-f8c1-4b1e-a320-b937eced1c3a/kube-multus/2.log" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.750564 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-knnfm_8e3bebeb-f8c1-4b1e-a320-b937eced1c3a/kube-multus/1.log" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.750637 4867 generic.go:334] "Generic (PLEG): container finished" podID="8e3bebeb-f8c1-4b1e-a320-b937eced1c3a" containerID="069d91feced47fa7dd985fd0691e86c74dc903221691f3caea33965e465d529f" exitCode=2 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.750723 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-knnfm" event={"ID":"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a","Type":"ContainerDied","Data":"069d91feced47fa7dd985fd0691e86c74dc903221691f3caea33965e465d529f"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.750794 4867 scope.go:117] "RemoveContainer" containerID="5db25e15e7d1c63ff17fb8979f10d3d86b38b65942e72251cacf977ff9d031b4" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.751857 4867 scope.go:117] "RemoveContainer" containerID="069d91feced47fa7dd985fd0691e86c74dc903221691f3caea33965e465d529f" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.752290 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-knnfm_openshift-multus(8e3bebeb-f8c1-4b1e-a320-b937eced1c3a)\"" pod="openshift-multus/multus-knnfm" podUID="8e3bebeb-f8c1-4b1e-a320-b937eced1c3a" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.756764 4867 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovnkube-controller/3.log" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.760773 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovn-acl-logging/0.log" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.761350 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlc7z_93569a52-4f36-4017-9834-b3651d6cd63e/ovn-controller/0.log" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.761874 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69" exitCode=0 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.761910 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9" exitCode=0 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.761920 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d" exitCode=0 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.761931 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3" exitCode=0 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.761942 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7" exitCode=0 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.761953 4867 generic.go:334] "Generic (PLEG): 
container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7" exitCode=0 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.761961 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8" exitCode=143 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.761973 4867 generic.go:334] "Generic (PLEG): container finished" podID="93569a52-4f36-4017-9834-b3651d6cd63e" containerID="e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315" exitCode=143 Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762002 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762046 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762066 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762082 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 
13:13:28.762100 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762116 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762133 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762147 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762155 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762163 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762171 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762181 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762190 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762199 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762225 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762233 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762245 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762279 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762288 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762296 4867 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762304 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762312 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762320 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762327 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762335 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762342 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762349 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762360 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762372 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762382 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762389 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762396 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762403 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762410 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762417 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7"} Oct 06 
13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762425 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762431 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762439 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762448 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" event={"ID":"93569a52-4f36-4017-9834-b3651d6cd63e","Type":"ContainerDied","Data":"e98a11e9ef63972ce1b8ba9d81d42ae322f95e27805595687b5d3c76065279c8"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762458 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762466 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762474 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762482 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762490 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762499 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762522 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762530 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762538 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762544 4867 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38"} Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.762657 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zlc7z" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.786978 4867 scope.go:117] "RemoveContainer" containerID="d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.804973 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-etc-openvswitch\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805442 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-systemd-units\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805181 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805476 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-env-overrides\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805640 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-kubelet\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805655 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805715 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-bin\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805704 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805758 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805811 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-ovn\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805845 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805871 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-var-lib-openvswitch\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805883 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805915 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.805941 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-config\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806015 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-slash\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806055 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-node-log\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806082 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/93569a52-4f36-4017-9834-b3651d6cd63e-ovn-node-metrics-cert\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806100 4867 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-log-socket\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806102 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-slash" (OuterVolumeSpecName: "host-slash") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806120 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqjws\" (UniqueName: \"kubernetes.io/projected/93569a52-4f36-4017-9834-b3651d6cd63e-kube-api-access-rqjws\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806138 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-script-lib\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806163 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-systemd\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806161 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-node-log" 
(OuterVolumeSpecName: "node-log") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806205 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806182 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.807364 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-ovn-kubernetes\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.807449 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-netd\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.807455 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.806225 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-log-socket" (OuterVolumeSpecName: "log-socket") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.807539 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.807559 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-netns\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.807004 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.807027 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.807658 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-openvswitch\") pod \"93569a52-4f36-4017-9834-b3651d6cd63e\" (UID: \"93569a52-4f36-4017-9834-b3651d6cd63e\") " Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.807688 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.807768 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.807997 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-log-socket\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808055 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7099dbeb-e464-4315-8c7b-45426641e8b3-ovnkube-script-lib\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztldm\" (UniqueName: \"kubernetes.io/projected/7099dbeb-e464-4315-8c7b-45426641e8b3-kube-api-access-ztldm\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808154 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-cni-netd\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808209 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-run-ovn\") pod \"ovnkube-node-x5qw7\" (UID: 
\"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808368 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808467 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-node-log\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808543 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-cni-bin\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808650 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-slash\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808689 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-run-openvswitch\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808726 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808779 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-kubelet\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808825 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7099dbeb-e464-4315-8c7b-45426641e8b3-ovn-node-metrics-cert\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808926 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-etc-openvswitch\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.808971 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7099dbeb-e464-4315-8c7b-45426641e8b3-ovnkube-config\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.809018 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7099dbeb-e464-4315-8c7b-45426641e8b3-env-overrides\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.809068 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-run-netns\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.809543 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-systemd-units\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.809699 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-run-systemd\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.809816 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-var-lib-openvswitch\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.809960 4867 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810011 4867 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810043 4867 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810071 4867 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810099 4867 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810125 4867 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810151 4867 reconciler_common.go:293] "Volume 
detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810170 4867 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810191 4867 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810213 4867 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810305 4867 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810340 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810361 4867 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-slash\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810411 4867 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-node-log\") on node \"crc\" 
DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810454 4867 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-log-socket\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810477 4867 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/93569a52-4f36-4017-9834-b3651d6cd63e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.810502 4867 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.812504 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93569a52-4f36-4017-9834-b3651d6cd63e-kube-api-access-rqjws" (OuterVolumeSpecName: "kube-api-access-rqjws") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "kube-api-access-rqjws". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.812695 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93569a52-4f36-4017-9834-b3651d6cd63e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.813767 4867 scope.go:117] "RemoveContainer" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.822402 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "93569a52-4f36-4017-9834-b3651d6cd63e" (UID: "93569a52-4f36-4017-9834-b3651d6cd63e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.832811 4867 scope.go:117] "RemoveContainer" containerID="7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.846549 4867 scope.go:117] "RemoveContainer" containerID="ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.858889 4867 scope.go:117] "RemoveContainer" containerID="848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.876936 4867 scope.go:117] "RemoveContainer" containerID="17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.889103 4867 scope.go:117] "RemoveContainer" containerID="0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.908131 4867 scope.go:117] "RemoveContainer" containerID="57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911092 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-systemd-units\") pod 
\"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911131 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-run-systemd\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911162 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-var-lib-openvswitch\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911184 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-log-socket\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911200 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-systemd-units\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911203 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7099dbeb-e464-4315-8c7b-45426641e8b3-ovnkube-script-lib\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911244 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztldm\" (UniqueName: \"kubernetes.io/projected/7099dbeb-e464-4315-8c7b-45426641e8b3-kube-api-access-ztldm\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911295 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-cni-netd\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911313 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-run-ovn\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911335 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911351 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-node-log\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911374 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-cni-bin\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911395 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-slash\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911413 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-run-openvswitch\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911433 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911452 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-kubelet\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc 
kubenswrapper[4867]: I1006 13:13:28.911467 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7099dbeb-e464-4315-8c7b-45426641e8b3-ovn-node-metrics-cert\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911484 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-etc-openvswitch\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911482 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-run-systemd\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911566 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-cni-bin\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911621 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-run-ovn\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911504 4867 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7099dbeb-e464-4315-8c7b-45426641e8b3-ovnkube-config\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911696 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911722 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7099dbeb-e464-4315-8c7b-45426641e8b3-env-overrides\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911760 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-node-log\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911497 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-var-lib-openvswitch\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911798 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-run-netns\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911819 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7099dbeb-e464-4315-8c7b-45426641e8b3-ovnkube-script-lib\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911834 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911915 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7099dbeb-e464-4315-8c7b-45426641e8b3-ovnkube-config\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.911979 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-log-socket\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.912000 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-run-netns\") pod 
\"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.912015 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-kubelet\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.912058 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-etc-openvswitch\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.912370 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-cni-netd\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.912405 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-run-openvswitch\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.912447 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7099dbeb-e464-4315-8c7b-45426641e8b3-host-slash\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 
13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.912522 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/93569a52-4f36-4017-9834-b3651d6cd63e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.912539 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqjws\" (UniqueName: \"kubernetes.io/projected/93569a52-4f36-4017-9834-b3651d6cd63e-kube-api-access-rqjws\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.912548 4867 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/93569a52-4f36-4017-9834-b3651d6cd63e-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.913226 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7099dbeb-e464-4315-8c7b-45426641e8b3-env-overrides\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.917757 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7099dbeb-e464-4315-8c7b-45426641e8b3-ovn-node-metrics-cert\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.926068 4867 scope.go:117] "RemoveContainer" containerID="e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.944557 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztldm\" (UniqueName: 
\"kubernetes.io/projected/7099dbeb-e464-4315-8c7b-45426641e8b3-kube-api-access-ztldm\") pod \"ovnkube-node-x5qw7\" (UID: \"7099dbeb-e464-4315-8c7b-45426641e8b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.945676 4867 scope.go:117] "RemoveContainer" containerID="0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.964211 4867 scope.go:117] "RemoveContainer" containerID="d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.965011 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69\": container with ID starting with d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69 not found: ID does not exist" containerID="d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.965105 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69"} err="failed to get container status \"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69\": rpc error: code = NotFound desc = could not find container \"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69\": container with ID starting with d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.965165 4867 scope.go:117] "RemoveContainer" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.965640 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\": container with ID starting with ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c not found: ID does not exist" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.965693 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c"} err="failed to get container status \"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\": rpc error: code = NotFound desc = could not find container \"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\": container with ID starting with ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.965719 4867 scope.go:117] "RemoveContainer" containerID="7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.966202 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\": container with ID starting with 7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9 not found: ID does not exist" containerID="7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.966337 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9"} err="failed to get container status \"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\": rpc error: code = NotFound desc = could not find container \"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\": container with ID 
starting with 7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.966394 4867 scope.go:117] "RemoveContainer" containerID="ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.966891 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\": container with ID starting with ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d not found: ID does not exist" containerID="ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.966934 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d"} err="failed to get container status \"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\": rpc error: code = NotFound desc = could not find container \"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\": container with ID starting with ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.966961 4867 scope.go:117] "RemoveContainer" containerID="848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.967735 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\": container with ID starting with 848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3 not found: ID does not exist" containerID="848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3" Oct 06 
13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.967785 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3"} err="failed to get container status \"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\": rpc error: code = NotFound desc = could not find container \"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\": container with ID starting with 848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.967826 4867 scope.go:117] "RemoveContainer" containerID="17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.968217 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\": container with ID starting with 17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7 not found: ID does not exist" containerID="17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.968273 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7"} err="failed to get container status \"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\": rpc error: code = NotFound desc = could not find container \"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\": container with ID starting with 17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.968294 4867 scope.go:117] "RemoveContainer" 
containerID="0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.968580 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\": container with ID starting with 0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7 not found: ID does not exist" containerID="0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.968625 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7"} err="failed to get container status \"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\": rpc error: code = NotFound desc = could not find container \"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\": container with ID starting with 0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.968645 4867 scope.go:117] "RemoveContainer" containerID="57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.968928 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\": container with ID starting with 57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8 not found: ID does not exist" containerID="57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.968957 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8"} err="failed to get container status \"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\": rpc error: code = NotFound desc = could not find container \"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\": container with ID starting with 57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.968973 4867 scope.go:117] "RemoveContainer" containerID="e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.969243 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\": container with ID starting with e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315 not found: ID does not exist" containerID="e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.969315 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315"} err="failed to get container status \"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\": rpc error: code = NotFound desc = could not find container \"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\": container with ID starting with e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.969355 4867 scope.go:117] "RemoveContainer" containerID="0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38" Oct 06 13:13:28 crc kubenswrapper[4867]: E1006 13:13:28.969692 4867 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\": container with ID starting with 0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38 not found: ID does not exist" containerID="0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.969723 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38"} err="failed to get container status \"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\": rpc error: code = NotFound desc = could not find container \"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\": container with ID starting with 0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.969741 4867 scope.go:117] "RemoveContainer" containerID="d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.970024 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69"} err="failed to get container status \"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69\": rpc error: code = NotFound desc = could not find container \"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69\": container with ID starting with d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.970061 4867 scope.go:117] "RemoveContainer" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.970482 4867 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c"} err="failed to get container status \"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\": rpc error: code = NotFound desc = could not find container \"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\": container with ID starting with ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.970517 4867 scope.go:117] "RemoveContainer" containerID="7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.970809 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9"} err="failed to get container status \"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\": rpc error: code = NotFound desc = could not find container \"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\": container with ID starting with 7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.970842 4867 scope.go:117] "RemoveContainer" containerID="ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.971118 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d"} err="failed to get container status \"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\": rpc error: code = NotFound desc = could not find container \"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\": container with ID starting with 
ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.971158 4867 scope.go:117] "RemoveContainer" containerID="848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.971577 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3"} err="failed to get container status \"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\": rpc error: code = NotFound desc = could not find container \"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\": container with ID starting with 848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.971609 4867 scope.go:117] "RemoveContainer" containerID="17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.971898 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7"} err="failed to get container status \"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\": rpc error: code = NotFound desc = could not find container \"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\": container with ID starting with 17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.971933 4867 scope.go:117] "RemoveContainer" containerID="0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.972178 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7"} err="failed to get container status \"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\": rpc error: code = NotFound desc = could not find container \"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\": container with ID starting with 0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.972208 4867 scope.go:117] "RemoveContainer" containerID="57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.972493 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8"} err="failed to get container status \"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\": rpc error: code = NotFound desc = could not find container \"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\": container with ID starting with 57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.972533 4867 scope.go:117] "RemoveContainer" containerID="e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.972830 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315"} err="failed to get container status \"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\": rpc error: code = NotFound desc = could not find container \"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\": container with ID starting with e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315 not found: ID does not 
exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.972855 4867 scope.go:117] "RemoveContainer" containerID="0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.973247 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38"} err="failed to get container status \"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\": rpc error: code = NotFound desc = could not find container \"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\": container with ID starting with 0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.973350 4867 scope.go:117] "RemoveContainer" containerID="d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.973744 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69"} err="failed to get container status \"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69\": rpc error: code = NotFound desc = could not find container \"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69\": container with ID starting with d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.973773 4867 scope.go:117] "RemoveContainer" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.974065 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c"} err="failed to get container status 
\"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\": rpc error: code = NotFound desc = could not find container \"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\": container with ID starting with ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.974112 4867 scope.go:117] "RemoveContainer" containerID="7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.974551 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9"} err="failed to get container status \"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\": rpc error: code = NotFound desc = could not find container \"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\": container with ID starting with 7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.974582 4867 scope.go:117] "RemoveContainer" containerID="ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.974888 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d"} err="failed to get container status \"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\": rpc error: code = NotFound desc = could not find container \"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\": container with ID starting with ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.974946 4867 scope.go:117] "RemoveContainer" 
containerID="848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.975366 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3"} err="failed to get container status \"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\": rpc error: code = NotFound desc = could not find container \"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\": container with ID starting with 848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.975393 4867 scope.go:117] "RemoveContainer" containerID="17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.975684 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7"} err="failed to get container status \"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\": rpc error: code = NotFound desc = could not find container \"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\": container with ID starting with 17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.975723 4867 scope.go:117] "RemoveContainer" containerID="0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.976129 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7"} err="failed to get container status \"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\": rpc error: code = NotFound desc = could 
not find container \"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\": container with ID starting with 0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.976157 4867 scope.go:117] "RemoveContainer" containerID="57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.976497 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8"} err="failed to get container status \"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\": rpc error: code = NotFound desc = could not find container \"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\": container with ID starting with 57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.976565 4867 scope.go:117] "RemoveContainer" containerID="e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.977026 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315"} err="failed to get container status \"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\": rpc error: code = NotFound desc = could not find container \"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\": container with ID starting with e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.977055 4867 scope.go:117] "RemoveContainer" containerID="0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 
13:13:28.977406 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38"} err="failed to get container status \"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\": rpc error: code = NotFound desc = could not find container \"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\": container with ID starting with 0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.977452 4867 scope.go:117] "RemoveContainer" containerID="d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.977740 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69"} err="failed to get container status \"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69\": rpc error: code = NotFound desc = could not find container \"d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69\": container with ID starting with d6e25fca250a6407291b9ba9b89ae72278625cd443ea541d34a68dc26e463b69 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.977781 4867 scope.go:117] "RemoveContainer" containerID="ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.978076 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c"} err="failed to get container status \"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\": rpc error: code = NotFound desc = could not find container \"ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c\": container with ID starting with 
ab249f5afe0b1297aa67b8d8169d2643aa7f6c4a333d2e789cc9041c33ec857c not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.978111 4867 scope.go:117] "RemoveContainer" containerID="7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.978498 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9"} err="failed to get container status \"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\": rpc error: code = NotFound desc = could not find container \"7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9\": container with ID starting with 7faf2a23f45730fa2f8a630b4b87e4f440200129284bd636884f2064a7e5f7a9 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.978529 4867 scope.go:117] "RemoveContainer" containerID="ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.978829 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d"} err="failed to get container status \"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\": rpc error: code = NotFound desc = could not find container \"ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d\": container with ID starting with ccdd4e1f1e711625f4ff55e83ed97462f3653310431697d3d7ebc5f6cd7d7c3d not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.978860 4867 scope.go:117] "RemoveContainer" containerID="848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.979106 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3"} err="failed to get container status \"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\": rpc error: code = NotFound desc = could not find container \"848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3\": container with ID starting with 848b2bcaa7718bddb3d7a39d69b2a02d673cc09c7e456d1435d62c8f09db78d3 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.979132 4867 scope.go:117] "RemoveContainer" containerID="17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.979397 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7"} err="failed to get container status \"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\": rpc error: code = NotFound desc = could not find container \"17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7\": container with ID starting with 17842d7d276ed129d1759a229f62782112c98f1ae38b54f9b978f31d76286bd7 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.979422 4867 scope.go:117] "RemoveContainer" containerID="0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.980623 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7"} err="failed to get container status \"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\": rpc error: code = NotFound desc = could not find container \"0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7\": container with ID starting with 0adfa76190a4b96887cf6e2e9c0974cd083fc87acd28ea77d05b6409b2a4ebb7 not found: ID does not 
exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.980686 4867 scope.go:117] "RemoveContainer" containerID="57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.980996 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8"} err="failed to get container status \"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\": rpc error: code = NotFound desc = could not find container \"57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8\": container with ID starting with 57f5fdd8d47c6dcf2e51e8247b507db67cec24a51ca2e933f796c6ea0b8d2ed8 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.981027 4867 scope.go:117] "RemoveContainer" containerID="e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.981689 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315"} err="failed to get container status \"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\": rpc error: code = NotFound desc = could not find container \"e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315\": container with ID starting with e5a98ec65b6c37592b66e0a7e872c753282e4aebaf53cd418fef87042975b315 not found: ID does not exist" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.981728 4867 scope.go:117] "RemoveContainer" containerID="0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38" Oct 06 13:13:28 crc kubenswrapper[4867]: I1006 13:13:28.982076 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38"} err="failed to get container status 
\"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\": rpc error: code = NotFound desc = could not find container \"0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38\": container with ID starting with 0b2419476329cb6fea5397480032e6529e9cb09d34d95f454e56c1d271371f38 not found: ID does not exist" Oct 06 13:13:29 crc kubenswrapper[4867]: I1006 13:13:29.032300 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:29 crc kubenswrapper[4867]: I1006 13:13:29.102243 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zlc7z"] Oct 06 13:13:29 crc kubenswrapper[4867]: I1006 13:13:29.110080 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zlc7z"] Oct 06 13:13:29 crc kubenswrapper[4867]: I1006 13:13:29.241908 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93569a52-4f36-4017-9834-b3651d6cd63e" path="/var/lib/kubelet/pods/93569a52-4f36-4017-9834-b3651d6cd63e/volumes" Oct 06 13:13:29 crc kubenswrapper[4867]: I1006 13:13:29.768045 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-knnfm_8e3bebeb-f8c1-4b1e-a320-b937eced1c3a/kube-multus/2.log" Oct 06 13:13:29 crc kubenswrapper[4867]: I1006 13:13:29.769399 4867 generic.go:334] "Generic (PLEG): container finished" podID="7099dbeb-e464-4315-8c7b-45426641e8b3" containerID="6a741e174ce5e567a5bd966bc20ae023261a45f8235593f0c1e33e8006f90120" exitCode=0 Oct 06 13:13:29 crc kubenswrapper[4867]: I1006 13:13:29.769444 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" event={"ID":"7099dbeb-e464-4315-8c7b-45426641e8b3","Type":"ContainerDied","Data":"6a741e174ce5e567a5bd966bc20ae023261a45f8235593f0c1e33e8006f90120"} Oct 06 13:13:29 crc kubenswrapper[4867]: I1006 13:13:29.769611 4867 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" event={"ID":"7099dbeb-e464-4315-8c7b-45426641e8b3","Type":"ContainerStarted","Data":"7cd6a01d942758d32b3af44c44f282da8401ee64e9003fcb88762f267e8edc6c"} Oct 06 13:13:30 crc kubenswrapper[4867]: I1006 13:13:30.784026 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" event={"ID":"7099dbeb-e464-4315-8c7b-45426641e8b3","Type":"ContainerStarted","Data":"9af3444b62de3e5e7c5c25b430f69b6fe7a9889c6b943b74b052bc6eaa47177d"} Oct 06 13:13:30 crc kubenswrapper[4867]: I1006 13:13:30.785099 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" event={"ID":"7099dbeb-e464-4315-8c7b-45426641e8b3","Type":"ContainerStarted","Data":"3d01615f0aa7c61aeb2fcebabd146fb3ebc94fcc7bb3b4e8ca21df0976ec7777"} Oct 06 13:13:30 crc kubenswrapper[4867]: I1006 13:13:30.785140 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" event={"ID":"7099dbeb-e464-4315-8c7b-45426641e8b3","Type":"ContainerStarted","Data":"6432b842efb120273b3c08c0d9d491970f1fe444d5b268cd26a829f69c526a47"} Oct 06 13:13:30 crc kubenswrapper[4867]: I1006 13:13:30.785167 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" event={"ID":"7099dbeb-e464-4315-8c7b-45426641e8b3","Type":"ContainerStarted","Data":"83185ad9e023688e762df5bf0e8a215c979ef3495d3f9882a9d37582f036d772"} Oct 06 13:13:30 crc kubenswrapper[4867]: I1006 13:13:30.785191 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" event={"ID":"7099dbeb-e464-4315-8c7b-45426641e8b3","Type":"ContainerStarted","Data":"d50e5f234c462669dbbae89e442e986ec076382ad8304f9cd6da944e0fcb3e91"} Oct 06 13:13:30 crc kubenswrapper[4867]: I1006 13:13:30.785210 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" 
event={"ID":"7099dbeb-e464-4315-8c7b-45426641e8b3","Type":"ContainerStarted","Data":"cb52c6278b4982015b3d167b0c658f5df15be9025904e934c9120fb6849a5a60"} Oct 06 13:13:33 crc kubenswrapper[4867]: I1006 13:13:33.816468 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" event={"ID":"7099dbeb-e464-4315-8c7b-45426641e8b3","Type":"ContainerStarted","Data":"c82abe71461db4fd60efc8473cecf4399046e5ab0e2014e6293334d4c6dbd7d7"} Oct 06 13:13:35 crc kubenswrapper[4867]: I1006 13:13:35.837950 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" event={"ID":"7099dbeb-e464-4315-8c7b-45426641e8b3","Type":"ContainerStarted","Data":"70fde2ebb70a9732150acbfc10a3e9ec533d22f6c994bb083086bba8057c4ffb"} Oct 06 13:13:35 crc kubenswrapper[4867]: I1006 13:13:35.838520 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:35 crc kubenswrapper[4867]: I1006 13:13:35.838568 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:35 crc kubenswrapper[4867]: I1006 13:13:35.838581 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:35 crc kubenswrapper[4867]: I1006 13:13:35.872842 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" podStartSLOduration=7.872814283 podStartE2EDuration="7.872814283s" podCreationTimestamp="2025-10-06 13:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:13:35.872520824 +0000 UTC m=+595.330468978" watchObservedRunningTime="2025-10-06 13:13:35.872814283 +0000 UTC m=+595.330762437" Oct 06 13:13:35 crc kubenswrapper[4867]: I1006 13:13:35.877833 4867 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:35 crc kubenswrapper[4867]: I1006 13:13:35.882821 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:13:40 crc kubenswrapper[4867]: I1006 13:13:40.222517 4867 scope.go:117] "RemoveContainer" containerID="069d91feced47fa7dd985fd0691e86c74dc903221691f3caea33965e465d529f" Oct 06 13:13:40 crc kubenswrapper[4867]: E1006 13:13:40.223783 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-knnfm_openshift-multus(8e3bebeb-f8c1-4b1e-a320-b937eced1c3a)\"" pod="openshift-multus/multus-knnfm" podUID="8e3bebeb-f8c1-4b1e-a320-b937eced1c3a" Oct 06 13:13:42 crc kubenswrapper[4867]: I1006 13:13:42.873757 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:13:42 crc kubenswrapper[4867]: I1006 13:13:42.874164 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:13:53 crc kubenswrapper[4867]: I1006 13:13:53.221888 4867 scope.go:117] "RemoveContainer" containerID="069d91feced47fa7dd985fd0691e86c74dc903221691f3caea33965e465d529f" Oct 06 13:13:53 crc kubenswrapper[4867]: I1006 13:13:53.992745 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-knnfm_8e3bebeb-f8c1-4b1e-a320-b937eced1c3a/kube-multus/2.log" Oct 06 13:13:53 crc kubenswrapper[4867]: I1006 13:13:53.993319 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-knnfm" event={"ID":"8e3bebeb-f8c1-4b1e-a320-b937eced1c3a","Type":"ContainerStarted","Data":"55ae300f263b2eb988b4906001783a8e5fe53bc454b0597fab3cef251ad557b0"} Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.332006 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2"] Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.336624 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.341575 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2"] Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.344831 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.393179 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x68j2\" (UniqueName: \"kubernetes.io/projected/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-kube-api-access-x68j2\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2\" (UID: \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.393242 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2\" (UID: \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.393294 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2\" (UID: \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.494594 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2\" (UID: \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.494664 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2\" (UID: \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.494748 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x68j2\" (UniqueName: \"kubernetes.io/projected/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-kube-api-access-x68j2\") pod 
\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2\" (UID: \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.495696 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2\" (UID: \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.495867 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2\" (UID: \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.516987 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x68j2\" (UniqueName: \"kubernetes.io/projected/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-kube-api-access-x68j2\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2\" (UID: \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.671183 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:13:57 crc kubenswrapper[4867]: I1006 13:13:57.932813 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2"] Oct 06 13:13:58 crc kubenswrapper[4867]: I1006 13:13:58.022740 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" event={"ID":"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be","Type":"ContainerStarted","Data":"ea52405386a85d8bc4c023b425a5062aeae48cad4223be7ffb1a9f556caca103"} Oct 06 13:13:59 crc kubenswrapper[4867]: I1006 13:13:59.030341 4867 generic.go:334] "Generic (PLEG): container finished" podID="0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" containerID="3f9a5c46b2cdc1215a7b82d4556b490d0f1fc68bbcdee5688bf93702ba649796" exitCode=0 Oct 06 13:13:59 crc kubenswrapper[4867]: I1006 13:13:59.030787 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" event={"ID":"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be","Type":"ContainerDied","Data":"3f9a5c46b2cdc1215a7b82d4556b490d0f1fc68bbcdee5688bf93702ba649796"} Oct 06 13:13:59 crc kubenswrapper[4867]: I1006 13:13:59.062767 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x5qw7" Oct 06 13:14:03 crc kubenswrapper[4867]: I1006 13:14:03.064652 4867 generic.go:334] "Generic (PLEG): container finished" podID="0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" containerID="aa2d2cd8303d5d73d1632991972d131dba85877f52e68d0a3d80ea17156fd89f" exitCode=0 Oct 06 13:14:03 crc kubenswrapper[4867]: I1006 13:14:03.064751 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" 
event={"ID":"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be","Type":"ContainerDied","Data":"aa2d2cd8303d5d73d1632991972d131dba85877f52e68d0a3d80ea17156fd89f"} Oct 06 13:14:04 crc kubenswrapper[4867]: I1006 13:14:04.079476 4867 generic.go:334] "Generic (PLEG): container finished" podID="0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" containerID="165b7a4f20fc4e2cd640d3e4d3da828d3b3ca56de1ccc8e3a9952492e8d0787c" exitCode=0 Oct 06 13:14:04 crc kubenswrapper[4867]: I1006 13:14:04.079570 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" event={"ID":"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be","Type":"ContainerDied","Data":"165b7a4f20fc4e2cd640d3e4d3da828d3b3ca56de1ccc8e3a9952492e8d0787c"} Oct 06 13:14:05 crc kubenswrapper[4867]: I1006 13:14:05.316271 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:14:05 crc kubenswrapper[4867]: I1006 13:14:05.419326 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-bundle\") pod \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\" (UID: \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " Oct 06 13:14:05 crc kubenswrapper[4867]: I1006 13:14:05.419440 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-util\") pod \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\" (UID: \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " Oct 06 13:14:05 crc kubenswrapper[4867]: I1006 13:14:05.419535 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x68j2\" (UniqueName: \"kubernetes.io/projected/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-kube-api-access-x68j2\") pod \"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\" (UID: 
\"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be\") " Oct 06 13:14:05 crc kubenswrapper[4867]: I1006 13:14:05.421673 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-bundle" (OuterVolumeSpecName: "bundle") pod "0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" (UID: "0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:14:05 crc kubenswrapper[4867]: I1006 13:14:05.427524 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-kube-api-access-x68j2" (OuterVolumeSpecName: "kube-api-access-x68j2") pod "0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" (UID: "0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be"). InnerVolumeSpecName "kube-api-access-x68j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:14:05 crc kubenswrapper[4867]: I1006 13:14:05.432430 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-util" (OuterVolumeSpecName: "util") pod "0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" (UID: "0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:14:05 crc kubenswrapper[4867]: I1006 13:14:05.521420 4867 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:14:05 crc kubenswrapper[4867]: I1006 13:14:05.521459 4867 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-util\") on node \"crc\" DevicePath \"\"" Oct 06 13:14:05 crc kubenswrapper[4867]: I1006 13:14:05.521472 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x68j2\" (UniqueName: \"kubernetes.io/projected/0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be-kube-api-access-x68j2\") on node \"crc\" DevicePath \"\"" Oct 06 13:14:06 crc kubenswrapper[4867]: I1006 13:14:06.098394 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" event={"ID":"0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be","Type":"ContainerDied","Data":"ea52405386a85d8bc4c023b425a5062aeae48cad4223be7ffb1a9f556caca103"} Oct 06 13:14:06 crc kubenswrapper[4867]: I1006 13:14:06.098891 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea52405386a85d8bc4c023b425a5062aeae48cad4223be7ffb1a9f556caca103" Oct 06 13:14:06 crc kubenswrapper[4867]: I1006 13:14:06.098556 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2" Oct 06 13:14:12 crc kubenswrapper[4867]: I1006 13:14:12.873367 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:14:12 crc kubenswrapper[4867]: I1006 13:14:12.873994 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:14:12 crc kubenswrapper[4867]: I1006 13:14:12.874066 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:14:12 crc kubenswrapper[4867]: I1006 13:14:12.874889 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e6c0a282c79916c755a5288bb4eb5014e330cf8dede8b87679a8dc2b50be474"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:14:12 crc kubenswrapper[4867]: I1006 13:14:12.874956 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://9e6c0a282c79916c755a5288bb4eb5014e330cf8dede8b87679a8dc2b50be474" gracePeriod=600 Oct 06 13:14:13 crc kubenswrapper[4867]: I1006 13:14:13.139895 4867 generic.go:334] "Generic (PLEG): 
container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="9e6c0a282c79916c755a5288bb4eb5014e330cf8dede8b87679a8dc2b50be474" exitCode=0 Oct 06 13:14:13 crc kubenswrapper[4867]: I1006 13:14:13.139983 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"9e6c0a282c79916c755a5288bb4eb5014e330cf8dede8b87679a8dc2b50be474"} Oct 06 13:14:13 crc kubenswrapper[4867]: I1006 13:14:13.140278 4867 scope.go:117] "RemoveContainer" containerID="7ebc66b3368265481d7ce17a498c1898a6fa0d78101df6ffd71b9d951872175a" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.146223 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"21254038d2e08625414f4e3fd77d4aa603650bf9aa5cea1080c49abec73a2651"} Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.635750 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-rxd2f"] Oct 06 13:14:14 crc kubenswrapper[4867]: E1006 13:14:14.636021 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" containerName="extract" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.636038 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" containerName="extract" Oct 06 13:14:14 crc kubenswrapper[4867]: E1006 13:14:14.636052 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" containerName="pull" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.636057 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" containerName="pull" Oct 06 13:14:14 crc kubenswrapper[4867]: E1006 
13:14:14.636073 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" containerName="util" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.636081 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" containerName="util" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.636208 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be" containerName="extract" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.636658 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-rxd2f" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.640485 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.641409 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.642222 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-zd62k" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.650354 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-rxd2f"] Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.750870 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n"] Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.751847 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.752223 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spwzj\" (UniqueName: \"kubernetes.io/projected/1b226da8-0bf8-4ead-b308-6677288373a3-kube-api-access-spwzj\") pod \"obo-prometheus-operator-7c8cf85677-rxd2f\" (UID: \"1b226da8-0bf8-4ead-b308-6677288373a3\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-rxd2f" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.754994 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-dmqsm" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.756131 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.776845 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54"] Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.778010 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.792691 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54"] Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.816834 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n"] Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.853900 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c780336-2ad2-49ef-97b4-0161e4dceb44-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-blr54\" (UID: \"4c780336-2ad2-49ef-97b4-0161e4dceb44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.853960 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b47e5b18-abb6-4dc9-bc90-c37e31034183-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-98f7n\" (UID: \"b47e5b18-abb6-4dc9-bc90-c37e31034183\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.853989 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c780336-2ad2-49ef-97b4-0161e4dceb44-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-blr54\" (UID: \"4c780336-2ad2-49ef-97b4-0161e4dceb44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 
13:14:14.854067 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b47e5b18-abb6-4dc9-bc90-c37e31034183-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-98f7n\" (UID: \"b47e5b18-abb6-4dc9-bc90-c37e31034183\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.854094 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spwzj\" (UniqueName: \"kubernetes.io/projected/1b226da8-0bf8-4ead-b308-6677288373a3-kube-api-access-spwzj\") pod \"obo-prometheus-operator-7c8cf85677-rxd2f\" (UID: \"1b226da8-0bf8-4ead-b308-6677288373a3\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-rxd2f" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.872990 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spwzj\" (UniqueName: \"kubernetes.io/projected/1b226da8-0bf8-4ead-b308-6677288373a3-kube-api-access-spwzj\") pod \"obo-prometheus-operator-7c8cf85677-rxd2f\" (UID: \"1b226da8-0bf8-4ead-b308-6677288373a3\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-rxd2f" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.955794 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c780336-2ad2-49ef-97b4-0161e4dceb44-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-blr54\" (UID: \"4c780336-2ad2-49ef-97b4-0161e4dceb44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.955848 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/b47e5b18-abb6-4dc9-bc90-c37e31034183-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-98f7n\" (UID: \"b47e5b18-abb6-4dc9-bc90-c37e31034183\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.955867 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c780336-2ad2-49ef-97b4-0161e4dceb44-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-blr54\" (UID: \"4c780336-2ad2-49ef-97b4-0161e4dceb44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.955914 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b47e5b18-abb6-4dc9-bc90-c37e31034183-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-98f7n\" (UID: \"b47e5b18-abb6-4dc9-bc90-c37e31034183\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.959687 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c780336-2ad2-49ef-97b4-0161e4dceb44-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-blr54\" (UID: \"4c780336-2ad2-49ef-97b4-0161e4dceb44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.960124 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-rxd2f" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.962619 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c780336-2ad2-49ef-97b4-0161e4dceb44-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-blr54\" (UID: \"4c780336-2ad2-49ef-97b4-0161e4dceb44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.966515 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-pkjkr"] Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.967638 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.967878 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b47e5b18-abb6-4dc9-bc90-c37e31034183-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-98f7n\" (UID: \"b47e5b18-abb6-4dc9-bc90-c37e31034183\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.972315 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b47e5b18-abb6-4dc9-bc90-c37e31034183-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c667696bd-98f7n\" (UID: \"b47e5b18-abb6-4dc9-bc90-c37e31034183\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.973166 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 
06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.975124 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-gcjzx" Oct 06 13:14:14 crc kubenswrapper[4867]: I1006 13:14:14.989942 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-pkjkr"] Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.057721 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4f4e099-818f-4e18-b1d2-dc026962eb51-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-pkjkr\" (UID: \"d4f4e099-818f-4e18-b1d2-dc026962eb51\") " pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.057851 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ms9z\" (UniqueName: \"kubernetes.io/projected/d4f4e099-818f-4e18-b1d2-dc026962eb51-kube-api-access-5ms9z\") pod \"observability-operator-cc5f78dfc-pkjkr\" (UID: \"d4f4e099-818f-4e18-b1d2-dc026962eb51\") " pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.068611 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.098714 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.159383 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4f4e099-818f-4e18-b1d2-dc026962eb51-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-pkjkr\" (UID: \"d4f4e099-818f-4e18-b1d2-dc026962eb51\") " pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.159445 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ms9z\" (UniqueName: \"kubernetes.io/projected/d4f4e099-818f-4e18-b1d2-dc026962eb51-kube-api-access-5ms9z\") pod \"observability-operator-cc5f78dfc-pkjkr\" (UID: \"d4f4e099-818f-4e18-b1d2-dc026962eb51\") " pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.173357 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4f4e099-818f-4e18-b1d2-dc026962eb51-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-pkjkr\" (UID: \"d4f4e099-818f-4e18-b1d2-dc026962eb51\") " pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.175777 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-zm28w"] Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.178371 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.181852 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-bf8nl" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.185036 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ms9z\" (UniqueName: \"kubernetes.io/projected/d4f4e099-818f-4e18-b1d2-dc026962eb51-kube-api-access-5ms9z\") pod \"observability-operator-cc5f78dfc-pkjkr\" (UID: \"d4f4e099-818f-4e18-b1d2-dc026962eb51\") " pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.196123 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-zm28w"] Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.261120 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6z6j\" (UniqueName: \"kubernetes.io/projected/cb9ae008-7e15-4aa1-84fa-93f513646286-kube-api-access-s6z6j\") pod \"perses-operator-54bc95c9fb-zm28w\" (UID: \"cb9ae008-7e15-4aa1-84fa-93f513646286\") " pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.261198 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/cb9ae008-7e15-4aa1-84fa-93f513646286-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-zm28w\" (UID: \"cb9ae008-7e15-4aa1-84fa-93f513646286\") " pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.287754 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-rxd2f"] Oct 06 13:14:15 crc kubenswrapper[4867]: W1006 13:14:15.314559 
4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b226da8_0bf8_4ead_b308_6677288373a3.slice/crio-1b980b5636ac5520fd5563ca1a3640ece45e814d7cba578aa4db426a5bc9345d WatchSource:0}: Error finding container 1b980b5636ac5520fd5563ca1a3640ece45e814d7cba578aa4db426a5bc9345d: Status 404 returned error can't find the container with id 1b980b5636ac5520fd5563ca1a3640ece45e814d7cba578aa4db426a5bc9345d Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.349015 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.362162 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6z6j\" (UniqueName: \"kubernetes.io/projected/cb9ae008-7e15-4aa1-84fa-93f513646286-kube-api-access-s6z6j\") pod \"perses-operator-54bc95c9fb-zm28w\" (UID: \"cb9ae008-7e15-4aa1-84fa-93f513646286\") " pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.362235 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/cb9ae008-7e15-4aa1-84fa-93f513646286-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-zm28w\" (UID: \"cb9ae008-7e15-4aa1-84fa-93f513646286\") " pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.363103 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/cb9ae008-7e15-4aa1-84fa-93f513646286-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-zm28w\" (UID: \"cb9ae008-7e15-4aa1-84fa-93f513646286\") " pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.396280 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6z6j\" (UniqueName: \"kubernetes.io/projected/cb9ae008-7e15-4aa1-84fa-93f513646286-kube-api-access-s6z6j\") pod \"perses-operator-54bc95c9fb-zm28w\" (UID: \"cb9ae008-7e15-4aa1-84fa-93f513646286\") " pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.400595 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n"] Oct 06 13:14:15 crc kubenswrapper[4867]: W1006 13:14:15.408989 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb47e5b18_abb6_4dc9_bc90_c37e31034183.slice/crio-032eab96c9e4628624c724af5bec9b023d51f1df4e028631deb3513b6685b31b WatchSource:0}: Error finding container 032eab96c9e4628624c724af5bec9b023d51f1df4e028631deb3513b6685b31b: Status 404 returned error can't find the container with id 032eab96c9e4628624c724af5bec9b023d51f1df4e028631deb3513b6685b31b Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.469684 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54"] Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.524688 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.585643 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-pkjkr"] Oct 06 13:14:15 crc kubenswrapper[4867]: W1006 13:14:15.603957 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4f4e099_818f_4e18_b1d2_dc026962eb51.slice/crio-5d7c248947e49f61bfc2a23d76390f5c86d333a2bdc80eff18a85560d932b1d7 WatchSource:0}: Error finding container 5d7c248947e49f61bfc2a23d76390f5c86d333a2bdc80eff18a85560d932b1d7: Status 404 returned error can't find the container with id 5d7c248947e49f61bfc2a23d76390f5c86d333a2bdc80eff18a85560d932b1d7 Oct 06 13:14:15 crc kubenswrapper[4867]: I1006 13:14:15.984666 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-zm28w"] Oct 06 13:14:15 crc kubenswrapper[4867]: W1006 13:14:15.994829 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb9ae008_7e15_4aa1_84fa_93f513646286.slice/crio-1dad44dd2809c8a4494c7e3280fdbdba0ae5aec30eab42beefc0faf55413fdb0 WatchSource:0}: Error finding container 1dad44dd2809c8a4494c7e3280fdbdba0ae5aec30eab42beefc0faf55413fdb0: Status 404 returned error can't find the container with id 1dad44dd2809c8a4494c7e3280fdbdba0ae5aec30eab42beefc0faf55413fdb0 Oct 06 13:14:16 crc kubenswrapper[4867]: I1006 13:14:16.165745 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" event={"ID":"b47e5b18-abb6-4dc9-bc90-c37e31034183","Type":"ContainerStarted","Data":"032eab96c9e4628624c724af5bec9b023d51f1df4e028631deb3513b6685b31b"} Oct 06 13:14:16 crc kubenswrapper[4867]: I1006 13:14:16.166783 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" event={"ID":"d4f4e099-818f-4e18-b1d2-dc026962eb51","Type":"ContainerStarted","Data":"5d7c248947e49f61bfc2a23d76390f5c86d333a2bdc80eff18a85560d932b1d7"} Oct 06 13:14:16 crc kubenswrapper[4867]: I1006 13:14:16.167741 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" event={"ID":"cb9ae008-7e15-4aa1-84fa-93f513646286","Type":"ContainerStarted","Data":"1dad44dd2809c8a4494c7e3280fdbdba0ae5aec30eab42beefc0faf55413fdb0"} Oct 06 13:14:16 crc kubenswrapper[4867]: I1006 13:14:16.168760 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54" event={"ID":"4c780336-2ad2-49ef-97b4-0161e4dceb44","Type":"ContainerStarted","Data":"5b89342e1a3ec90f21b2ea9a420b3d449d74e7a9314eafcedb474f182f4b0f16"} Oct 06 13:14:16 crc kubenswrapper[4867]: I1006 13:14:16.169662 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-rxd2f" event={"ID":"1b226da8-0bf8-4ead-b308-6677288373a3","Type":"ContainerStarted","Data":"1b980b5636ac5520fd5563ca1a3640ece45e814d7cba578aa4db426a5bc9345d"} Oct 06 13:14:32 crc kubenswrapper[4867]: E1006 13:14:32.239435 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d" Oct 06 13:14:32 crc kubenswrapper[4867]: E1006 13:14:32.240511 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-7c667696bd-98f7n_openshift-operators(b47e5b18-abb6-4dc9-bc90-c37e31034183): 
ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 13:14:32 crc kubenswrapper[4867]: E1006 13:14:32.241713 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" podUID="b47e5b18-abb6-4dc9-bc90-c37e31034183" Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.303495 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54" event={"ID":"4c780336-2ad2-49ef-97b4-0161e4dceb44","Type":"ContainerStarted","Data":"60c214b63baf2bfe587b88bd2436c5aaca6d45a65b144d6046070683aa8ff661"} Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.305436 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-rxd2f" event={"ID":"1b226da8-0bf8-4ead-b308-6677288373a3","Type":"ContainerStarted","Data":"6889b31afbf954c4fd89622caca239eb1b9b12163a78872e75c33c4fd36c9c9b"} Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.306617 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" event={"ID":"b47e5b18-abb6-4dc9-bc90-c37e31034183","Type":"ContainerStarted","Data":"4c2604d81750f9fbc79f0b249b1cc6cf925eb40b443978e285066ea8a4b64aa2"} Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.308109 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" event={"ID":"d4f4e099-818f-4e18-b1d2-dc026962eb51","Type":"ContainerStarted","Data":"6f187240373ecaf3df4e9e524bfce9ff3608a66c683cf98dc2cb0b6205c9568a"} Oct 06 
13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.308522 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.309168 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" event={"ID":"cb9ae008-7e15-4aa1-84fa-93f513646286","Type":"ContainerStarted","Data":"475919097a792091f8f45ca21af8c1e2739ba81e2811336a3b2e0a1d953311db"} Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.309552 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.335510 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-blr54" podStartSLOduration=2.47028755 podStartE2EDuration="19.335490512s" podCreationTimestamp="2025-10-06 13:14:14 +0000 UTC" firstStartedPulling="2025-10-06 13:14:15.5013536 +0000 UTC m=+634.959301744" lastFinishedPulling="2025-10-06 13:14:32.366556562 +0000 UTC m=+651.824504706" observedRunningTime="2025-10-06 13:14:33.329476808 +0000 UTC m=+652.787424962" watchObservedRunningTime="2025-10-06 13:14:33.335490512 +0000 UTC m=+652.793438656" Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.361679 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" podStartSLOduration=2.046043727 podStartE2EDuration="18.361650784s" podCreationTimestamp="2025-10-06 13:14:15 +0000 UTC" firstStartedPulling="2025-10-06 13:14:15.997560921 +0000 UTC m=+635.455509065" lastFinishedPulling="2025-10-06 13:14:32.313167978 +0000 UTC m=+651.771116122" observedRunningTime="2025-10-06 13:14:33.356200416 +0000 UTC m=+652.814148550" watchObservedRunningTime="2025-10-06 13:14:33.361650784 +0000 UTC 
m=+652.819598928" Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.383867 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-rxd2f" podStartSLOduration=2.414360829 podStartE2EDuration="19.383840839s" podCreationTimestamp="2025-10-06 13:14:14 +0000 UTC" firstStartedPulling="2025-10-06 13:14:15.319485985 +0000 UTC m=+634.777434129" lastFinishedPulling="2025-10-06 13:14:32.288965995 +0000 UTC m=+651.746914139" observedRunningTime="2025-10-06 13:14:33.379983794 +0000 UTC m=+652.837931938" watchObservedRunningTime="2025-10-06 13:14:33.383840839 +0000 UTC m=+652.841788973" Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.386407 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.412343 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-pkjkr" podStartSLOduration=2.652696476 podStartE2EDuration="19.412316564s" podCreationTimestamp="2025-10-06 13:14:14 +0000 UTC" firstStartedPulling="2025-10-06 13:14:15.606910813 +0000 UTC m=+635.064858957" lastFinishedPulling="2025-10-06 13:14:32.366530901 +0000 UTC m=+651.824479045" observedRunningTime="2025-10-06 13:14:33.410483044 +0000 UTC m=+652.868431208" watchObservedRunningTime="2025-10-06 13:14:33.412316564 +0000 UTC m=+652.870264708" Oct 06 13:14:33 crc kubenswrapper[4867]: I1006 13:14:33.436095 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c667696bd-98f7n" podStartSLOduration=-9223372017.418705 podStartE2EDuration="19.436070441s" podCreationTimestamp="2025-10-06 13:14:14 +0000 UTC" firstStartedPulling="2025-10-06 13:14:15.411047735 +0000 UTC m=+634.868995879" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-06 13:14:33.432566255 +0000 UTC m=+652.890514399" watchObservedRunningTime="2025-10-06 13:14:33.436070441 +0000 UTC m=+652.894018585" Oct 06 13:14:45 crc kubenswrapper[4867]: I1006 13:14:45.529042 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-zm28w" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.138859 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh"] Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.140584 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.145044 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.145905 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.151377 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh"] Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.286410 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-secret-volume\") pod \"collect-profiles-29329275-bp6nh\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.286489 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbd7r\" (UniqueName: 
\"kubernetes.io/projected/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-kube-api-access-wbd7r\") pod \"collect-profiles-29329275-bp6nh\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.286520 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-config-volume\") pod \"collect-profiles-29329275-bp6nh\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.387722 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-config-volume\") pod \"collect-profiles-29329275-bp6nh\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.387882 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-secret-volume\") pod \"collect-profiles-29329275-bp6nh\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.387922 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbd7r\" (UniqueName: \"kubernetes.io/projected/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-kube-api-access-wbd7r\") pod \"collect-profiles-29329275-bp6nh\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:00 crc 
kubenswrapper[4867]: I1006 13:15:00.389654 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-config-volume\") pod \"collect-profiles-29329275-bp6nh\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.396147 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-secret-volume\") pod \"collect-profiles-29329275-bp6nh\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.408613 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbd7r\" (UniqueName: \"kubernetes.io/projected/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-kube-api-access-wbd7r\") pod \"collect-profiles-29329275-bp6nh\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.460049 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:00 crc kubenswrapper[4867]: I1006 13:15:00.910305 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh"] Oct 06 13:15:01 crc kubenswrapper[4867]: I1006 13:15:01.496054 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" event={"ID":"b8a8b845-d9b0-4111-b9f2-01d31c27fefe","Type":"ContainerStarted","Data":"41b406e3e6737e8becb91e76ea65fc67b6c220088aba022643bd566cd6dff115"} Oct 06 13:15:02 crc kubenswrapper[4867]: I1006 13:15:02.504618 4867 generic.go:334] "Generic (PLEG): container finished" podID="b8a8b845-d9b0-4111-b9f2-01d31c27fefe" containerID="48072206700e8e52d49ddad0cd2d2e4e13327965a3654a4cfb8f24e4fa2ff144" exitCode=0 Oct 06 13:15:02 crc kubenswrapper[4867]: I1006 13:15:02.505115 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" event={"ID":"b8a8b845-d9b0-4111-b9f2-01d31c27fefe","Type":"ContainerDied","Data":"48072206700e8e52d49ddad0cd2d2e4e13327965a3654a4cfb8f24e4fa2ff144"} Oct 06 13:15:02 crc kubenswrapper[4867]: I1006 13:15:02.835377 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc"] Oct 06 13:15:02 crc kubenswrapper[4867]: I1006 13:15:02.836551 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:02 crc kubenswrapper[4867]: I1006 13:15:02.839856 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 13:15:02 crc kubenswrapper[4867]: I1006 13:15:02.848855 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc"] Oct 06 13:15:02 crc kubenswrapper[4867]: I1006 13:15:02.931774 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:02 crc kubenswrapper[4867]: I1006 13:15:02.931858 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr7b4\" (UniqueName: \"kubernetes.io/projected/2548a3ec-5354-4309-a045-1a29253ad94b-kube-api-access-rr7b4\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:02 crc kubenswrapper[4867]: I1006 13:15:02.932058 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:03 crc kubenswrapper[4867]: 
I1006 13:15:03.033465 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr7b4\" (UniqueName: \"kubernetes.io/projected/2548a3ec-5354-4309-a045-1a29253ad94b-kube-api-access-rr7b4\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.033791 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.033946 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.034678 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.034769 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.065898 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr7b4\" (UniqueName: \"kubernetes.io/projected/2548a3ec-5354-4309-a045-1a29253ad94b-kube-api-access-rr7b4\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.157822 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.426450 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc"] Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.511550 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" event={"ID":"2548a3ec-5354-4309-a045-1a29253ad94b","Type":"ContainerStarted","Data":"a24c5f3beee05d47e8967bcb2aa41538051c8b41a1cc85dd695c729c36d38334"} Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.736022 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.844167 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-config-volume\") pod \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.844234 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-secret-volume\") pod \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.844330 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbd7r\" (UniqueName: \"kubernetes.io/projected/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-kube-api-access-wbd7r\") pod \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\" (UID: \"b8a8b845-d9b0-4111-b9f2-01d31c27fefe\") " Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.845072 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-config-volume" (OuterVolumeSpecName: "config-volume") pod "b8a8b845-d9b0-4111-b9f2-01d31c27fefe" (UID: "b8a8b845-d9b0-4111-b9f2-01d31c27fefe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.850034 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b8a8b845-d9b0-4111-b9f2-01d31c27fefe" (UID: "b8a8b845-d9b0-4111-b9f2-01d31c27fefe"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.850202 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-kube-api-access-wbd7r" (OuterVolumeSpecName: "kube-api-access-wbd7r") pod "b8a8b845-d9b0-4111-b9f2-01d31c27fefe" (UID: "b8a8b845-d9b0-4111-b9f2-01d31c27fefe"). InnerVolumeSpecName "kube-api-access-wbd7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.945711 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.945758 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:15:03 crc kubenswrapper[4867]: I1006 13:15:03.945769 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbd7r\" (UniqueName: \"kubernetes.io/projected/b8a8b845-d9b0-4111-b9f2-01d31c27fefe-kube-api-access-wbd7r\") on node \"crc\" DevicePath \"\"" Oct 06 13:15:04 crc kubenswrapper[4867]: I1006 13:15:04.519156 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" event={"ID":"b8a8b845-d9b0-4111-b9f2-01d31c27fefe","Type":"ContainerDied","Data":"41b406e3e6737e8becb91e76ea65fc67b6c220088aba022643bd566cd6dff115"} Oct 06 13:15:04 crc kubenswrapper[4867]: I1006 13:15:04.519458 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41b406e3e6737e8becb91e76ea65fc67b6c220088aba022643bd566cd6dff115" Oct 06 13:15:04 crc kubenswrapper[4867]: I1006 13:15:04.519203 4867 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh" Oct 06 13:15:04 crc kubenswrapper[4867]: I1006 13:15:04.520943 4867 generic.go:334] "Generic (PLEG): container finished" podID="2548a3ec-5354-4309-a045-1a29253ad94b" containerID="cf2b7baf200beabc485e97c93f7b58c69576aa3844ec299ea52cc1010b65f1da" exitCode=0 Oct 06 13:15:04 crc kubenswrapper[4867]: I1006 13:15:04.521001 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" event={"ID":"2548a3ec-5354-4309-a045-1a29253ad94b","Type":"ContainerDied","Data":"cf2b7baf200beabc485e97c93f7b58c69576aa3844ec299ea52cc1010b65f1da"} Oct 06 13:15:07 crc kubenswrapper[4867]: I1006 13:15:07.537403 4867 generic.go:334] "Generic (PLEG): container finished" podID="2548a3ec-5354-4309-a045-1a29253ad94b" containerID="42eb065883e21a9a2084d9cbbd9aeedfd9e1f21b17c1bea09eb96e643b0f7b62" exitCode=0 Oct 06 13:15:07 crc kubenswrapper[4867]: I1006 13:15:07.537572 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" event={"ID":"2548a3ec-5354-4309-a045-1a29253ad94b","Type":"ContainerDied","Data":"42eb065883e21a9a2084d9cbbd9aeedfd9e1f21b17c1bea09eb96e643b0f7b62"} Oct 06 13:15:08 crc kubenswrapper[4867]: I1006 13:15:08.546389 4867 generic.go:334] "Generic (PLEG): container finished" podID="2548a3ec-5354-4309-a045-1a29253ad94b" containerID="29e53d5e40757f5445eb54252d55f43cad4862a489a93e48f50a530125c95922" exitCode=0 Oct 06 13:15:08 crc kubenswrapper[4867]: I1006 13:15:08.546466 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" event={"ID":"2548a3ec-5354-4309-a045-1a29253ad94b","Type":"ContainerDied","Data":"29e53d5e40757f5445eb54252d55f43cad4862a489a93e48f50a530125c95922"} Oct 06 13:15:09 crc kubenswrapper[4867]: I1006 
13:15:09.763408 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:09 crc kubenswrapper[4867]: I1006 13:15:09.827289 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-bundle\") pod \"2548a3ec-5354-4309-a045-1a29253ad94b\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " Oct 06 13:15:09 crc kubenswrapper[4867]: I1006 13:15:09.827489 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr7b4\" (UniqueName: \"kubernetes.io/projected/2548a3ec-5354-4309-a045-1a29253ad94b-kube-api-access-rr7b4\") pod \"2548a3ec-5354-4309-a045-1a29253ad94b\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " Oct 06 13:15:09 crc kubenswrapper[4867]: I1006 13:15:09.827536 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-util\") pod \"2548a3ec-5354-4309-a045-1a29253ad94b\" (UID: \"2548a3ec-5354-4309-a045-1a29253ad94b\") " Oct 06 13:15:09 crc kubenswrapper[4867]: I1006 13:15:09.830569 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-bundle" (OuterVolumeSpecName: "bundle") pod "2548a3ec-5354-4309-a045-1a29253ad94b" (UID: "2548a3ec-5354-4309-a045-1a29253ad94b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:15:09 crc kubenswrapper[4867]: I1006 13:15:09.838524 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-util" (OuterVolumeSpecName: "util") pod "2548a3ec-5354-4309-a045-1a29253ad94b" (UID: "2548a3ec-5354-4309-a045-1a29253ad94b"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:15:09 crc kubenswrapper[4867]: I1006 13:15:09.838636 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2548a3ec-5354-4309-a045-1a29253ad94b-kube-api-access-rr7b4" (OuterVolumeSpecName: "kube-api-access-rr7b4") pod "2548a3ec-5354-4309-a045-1a29253ad94b" (UID: "2548a3ec-5354-4309-a045-1a29253ad94b"). InnerVolumeSpecName "kube-api-access-rr7b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:15:09 crc kubenswrapper[4867]: I1006 13:15:09.929341 4867 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-util\") on node \"crc\" DevicePath \"\"" Oct 06 13:15:09 crc kubenswrapper[4867]: I1006 13:15:09.929417 4867 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2548a3ec-5354-4309-a045-1a29253ad94b-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:15:09 crc kubenswrapper[4867]: I1006 13:15:09.929442 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr7b4\" (UniqueName: \"kubernetes.io/projected/2548a3ec-5354-4309-a045-1a29253ad94b-kube-api-access-rr7b4\") on node \"crc\" DevicePath \"\"" Oct 06 13:15:10 crc kubenswrapper[4867]: I1006 13:15:10.560884 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" event={"ID":"2548a3ec-5354-4309-a045-1a29253ad94b","Type":"ContainerDied","Data":"a24c5f3beee05d47e8967bcb2aa41538051c8b41a1cc85dd695c729c36d38334"} Oct 06 13:15:10 crc kubenswrapper[4867]: I1006 13:15:10.560940 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24c5f3beee05d47e8967bcb2aa41538051c8b41a1cc85dd695c729c36d38334" Oct 06 13:15:10 crc kubenswrapper[4867]: I1006 13:15:10.560957 4867 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.377883 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-b6qnd"] Oct 06 13:15:14 crc kubenswrapper[4867]: E1006 13:15:14.378673 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2548a3ec-5354-4309-a045-1a29253ad94b" containerName="pull" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.378696 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2548a3ec-5354-4309-a045-1a29253ad94b" containerName="pull" Oct 06 13:15:14 crc kubenswrapper[4867]: E1006 13:15:14.378724 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2548a3ec-5354-4309-a045-1a29253ad94b" containerName="util" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.378733 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2548a3ec-5354-4309-a045-1a29253ad94b" containerName="util" Oct 06 13:15:14 crc kubenswrapper[4867]: E1006 13:15:14.378748 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a8b845-d9b0-4111-b9f2-01d31c27fefe" containerName="collect-profiles" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.378757 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a8b845-d9b0-4111-b9f2-01d31c27fefe" containerName="collect-profiles" Oct 06 13:15:14 crc kubenswrapper[4867]: E1006 13:15:14.378775 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2548a3ec-5354-4309-a045-1a29253ad94b" containerName="extract" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.378784 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2548a3ec-5354-4309-a045-1a29253ad94b" containerName="extract" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.378932 4867 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b8a8b845-d9b0-4111-b9f2-01d31c27fefe" containerName="collect-profiles" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.378952 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2548a3ec-5354-4309-a045-1a29253ad94b" containerName="extract" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.379601 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-b6qnd" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.382082 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.382473 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ncw59" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.386621 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.388522 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-b6qnd"] Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.494664 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdmlz\" (UniqueName: \"kubernetes.io/projected/1bed039e-de7f-49b2-b0fe-47e8bc055e8d-kube-api-access-pdmlz\") pod \"nmstate-operator-858ddd8f98-b6qnd\" (UID: \"1bed039e-de7f-49b2-b0fe-47e8bc055e8d\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-b6qnd" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.596416 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdmlz\" (UniqueName: \"kubernetes.io/projected/1bed039e-de7f-49b2-b0fe-47e8bc055e8d-kube-api-access-pdmlz\") pod \"nmstate-operator-858ddd8f98-b6qnd\" (UID: \"1bed039e-de7f-49b2-b0fe-47e8bc055e8d\") " 
pod="openshift-nmstate/nmstate-operator-858ddd8f98-b6qnd" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.615536 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdmlz\" (UniqueName: \"kubernetes.io/projected/1bed039e-de7f-49b2-b0fe-47e8bc055e8d-kube-api-access-pdmlz\") pod \"nmstate-operator-858ddd8f98-b6qnd\" (UID: \"1bed039e-de7f-49b2-b0fe-47e8bc055e8d\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-b6qnd" Oct 06 13:15:14 crc kubenswrapper[4867]: I1006 13:15:14.701353 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-b6qnd" Oct 06 13:15:15 crc kubenswrapper[4867]: I1006 13:15:15.104183 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-b6qnd"] Oct 06 13:15:15 crc kubenswrapper[4867]: I1006 13:15:15.592096 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-b6qnd" event={"ID":"1bed039e-de7f-49b2-b0fe-47e8bc055e8d","Type":"ContainerStarted","Data":"a7fc7cafbfacaf444c5385c8b9cbbbe29706c904444671dc86377c163f0cf29c"} Oct 06 13:15:18 crc kubenswrapper[4867]: I1006 13:15:18.611246 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-b6qnd" event={"ID":"1bed039e-de7f-49b2-b0fe-47e8bc055e8d","Type":"ContainerStarted","Data":"3d75644c38a257882b776589c8ad329ff76ee927733d0a210e68460a7417d98f"} Oct 06 13:15:18 crc kubenswrapper[4867]: I1006 13:15:18.627805 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-b6qnd" podStartSLOduration=1.948971082 podStartE2EDuration="4.627784583s" podCreationTimestamp="2025-10-06 13:15:14 +0000 UTC" firstStartedPulling="2025-10-06 13:15:15.127434744 +0000 UTC m=+694.585382888" lastFinishedPulling="2025-10-06 13:15:17.806248235 +0000 UTC m=+697.264196389" 
observedRunningTime="2025-10-06 13:15:18.624841543 +0000 UTC m=+698.082789697" watchObservedRunningTime="2025-10-06 13:15:18.627784583 +0000 UTC m=+698.085732727" Oct 06 13:15:26 crc kubenswrapper[4867]: I1006 13:15:26.949697 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl"] Oct 06 13:15:26 crc kubenswrapper[4867]: I1006 13:15:26.951480 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl" Oct 06 13:15:26 crc kubenswrapper[4867]: I1006 13:15:26.955168 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-w7qln" Oct 06 13:15:26 crc kubenswrapper[4867]: I1006 13:15:26.959968 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl"] Oct 06 13:15:26 crc kubenswrapper[4867]: I1006 13:15:26.963498 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-fm444"] Oct 06 13:15:26 crc kubenswrapper[4867]: I1006 13:15:26.964332 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" Oct 06 13:15:26 crc kubenswrapper[4867]: I1006 13:15:26.966730 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 06 13:15:26 crc kubenswrapper[4867]: I1006 13:15:26.987298 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-fm444"] Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.007434 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-wvv72"] Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.008555 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.077040 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e19fdddd-1727-4c4b-985f-7548c278b0ca-dbus-socket\") pod \"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.077098 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5802445f-947f-4d52-b1f3-91f404ef0088-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-fm444\" (UID: \"5802445f-947f-4d52-b1f3-91f404ef0088\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.077131 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e19fdddd-1727-4c4b-985f-7548c278b0ca-nmstate-lock\") pod \"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.077357 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frvsp\" (UniqueName: \"kubernetes.io/projected/5802445f-947f-4d52-b1f3-91f404ef0088-kube-api-access-frvsp\") pod \"nmstate-webhook-6cdbc54649-fm444\" (UID: \"5802445f-947f-4d52-b1f3-91f404ef0088\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.077450 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e19fdddd-1727-4c4b-985f-7548c278b0ca-ovs-socket\") pod 
\"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.077492 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b42nf\" (UniqueName: \"kubernetes.io/projected/e19fdddd-1727-4c4b-985f-7548c278b0ca-kube-api-access-b42nf\") pod \"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.077625 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wckhn\" (UniqueName: \"kubernetes.io/projected/99a4464a-a11f-4a4e-86ae-43a9a76b060a-kube-api-access-wckhn\") pod \"nmstate-metrics-fdff9cb8d-vdksl\" (UID: \"99a4464a-a11f-4a4e-86ae-43a9a76b060a\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.104815 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk"] Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.105648 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.108105 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.108377 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.111012 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-m5qq2" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.122209 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk"] Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.179600 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frvsp\" (UniqueName: \"kubernetes.io/projected/5802445f-947f-4d52-b1f3-91f404ef0088-kube-api-access-frvsp\") pod \"nmstate-webhook-6cdbc54649-fm444\" (UID: \"5802445f-947f-4d52-b1f3-91f404ef0088\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.179661 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhjfd\" (UniqueName: \"kubernetes.io/projected/fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83-kube-api-access-hhjfd\") pod \"nmstate-console-plugin-6b874cbd85-trjgk\" (UID: \"fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.179703 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e19fdddd-1727-4c4b-985f-7548c278b0ca-ovs-socket\") pod \"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " 
pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.179725 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b42nf\" (UniqueName: \"kubernetes.io/projected/e19fdddd-1727-4c4b-985f-7548c278b0ca-kube-api-access-b42nf\") pod \"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.179754 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wckhn\" (UniqueName: \"kubernetes.io/projected/99a4464a-a11f-4a4e-86ae-43a9a76b060a-kube-api-access-wckhn\") pod \"nmstate-metrics-fdff9cb8d-vdksl\" (UID: \"99a4464a-a11f-4a4e-86ae-43a9a76b060a\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.179791 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-trjgk\" (UID: \"fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.179816 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e19fdddd-1727-4c4b-985f-7548c278b0ca-dbus-socket\") pod \"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.179842 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-trjgk\" 
(UID: \"fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.179866 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5802445f-947f-4d52-b1f3-91f404ef0088-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-fm444\" (UID: \"5802445f-947f-4d52-b1f3-91f404ef0088\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.179893 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e19fdddd-1727-4c4b-985f-7548c278b0ca-nmstate-lock\") pod \"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.179961 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e19fdddd-1727-4c4b-985f-7548c278b0ca-nmstate-lock\") pod \"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.180279 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e19fdddd-1727-4c4b-985f-7548c278b0ca-dbus-socket\") pod \"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: E1006 13:15:27.180359 4867 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.180384 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/e19fdddd-1727-4c4b-985f-7548c278b0ca-ovs-socket\") pod \"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: E1006 13:15:27.180454 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5802445f-947f-4d52-b1f3-91f404ef0088-tls-key-pair podName:5802445f-947f-4d52-b1f3-91f404ef0088 nodeName:}" failed. No retries permitted until 2025-10-06 13:15:27.680428459 +0000 UTC m=+707.138376603 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/5802445f-947f-4d52-b1f3-91f404ef0088-tls-key-pair") pod "nmstate-webhook-6cdbc54649-fm444" (UID: "5802445f-947f-4d52-b1f3-91f404ef0088") : secret "openshift-nmstate-webhook" not found Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.198751 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b42nf\" (UniqueName: \"kubernetes.io/projected/e19fdddd-1727-4c4b-985f-7548c278b0ca-kube-api-access-b42nf\") pod \"nmstate-handler-wvv72\" (UID: \"e19fdddd-1727-4c4b-985f-7548c278b0ca\") " pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.199040 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frvsp\" (UniqueName: \"kubernetes.io/projected/5802445f-947f-4d52-b1f3-91f404ef0088-kube-api-access-frvsp\") pod \"nmstate-webhook-6cdbc54649-fm444\" (UID: \"5802445f-947f-4d52-b1f3-91f404ef0088\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.199436 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wckhn\" (UniqueName: \"kubernetes.io/projected/99a4464a-a11f-4a4e-86ae-43a9a76b060a-kube-api-access-wckhn\") pod \"nmstate-metrics-fdff9cb8d-vdksl\" (UID: 
\"99a4464a-a11f-4a4e-86ae-43a9a76b060a\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.268009 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.281137 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-trjgk\" (UID: \"fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.281184 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-trjgk\" (UID: \"fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.281299 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhjfd\" (UniqueName: \"kubernetes.io/projected/fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83-kube-api-access-hhjfd\") pod \"nmstate-console-plugin-6b874cbd85-trjgk\" (UID: \"fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.283360 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-trjgk\" (UID: \"fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" Oct 06 13:15:27 crc 
kubenswrapper[4867]: I1006 13:15:27.285345 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-trjgk\" (UID: \"fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.312645 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhjfd\" (UniqueName: \"kubernetes.io/projected/fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83-kube-api-access-hhjfd\") pod \"nmstate-console-plugin-6b874cbd85-trjgk\" (UID: \"fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.322595 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.339722 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7784897869-654j4"] Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.340927 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.356145 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7784897869-654j4"] Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.420175 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.484805 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdbd6148-8181-41c9-b578-2854674e5252-console-serving-cert\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.484870 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-oauth-serving-cert\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.485006 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-console-config\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.485056 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdbd6148-8181-41c9-b578-2854674e5252-console-oauth-config\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.485436 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crprm\" (UniqueName: 
\"kubernetes.io/projected/cdbd6148-8181-41c9-b578-2854674e5252-kube-api-access-crprm\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.485483 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-service-ca\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.485579 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-trusted-ca-bundle\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.521587 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl"] Oct 06 13:15:27 crc kubenswrapper[4867]: W1006 13:15:27.525650 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99a4464a_a11f_4a4e_86ae_43a9a76b060a.slice/crio-ae3f6e79a824a92cd8fb2bd076dcb432d5bc9549eb374a0c2b81d11f7d87eab3 WatchSource:0}: Error finding container ae3f6e79a824a92cd8fb2bd076dcb432d5bc9549eb374a0c2b81d11f7d87eab3: Status 404 returned error can't find the container with id ae3f6e79a824a92cd8fb2bd076dcb432d5bc9549eb374a0c2b81d11f7d87eab3 Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.586587 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-console-config\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.586639 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdbd6148-8181-41c9-b578-2854674e5252-console-oauth-config\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.588034 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-console-config\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.588075 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crprm\" (UniqueName: \"kubernetes.io/projected/cdbd6148-8181-41c9-b578-2854674e5252-kube-api-access-crprm\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.588139 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-service-ca\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.588184 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-trusted-ca-bundle\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.588227 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdbd6148-8181-41c9-b578-2854674e5252-console-serving-cert\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.588275 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-oauth-serving-cert\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.588811 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-oauth-serving-cert\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.590567 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-service-ca\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.590821 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cdbd6148-8181-41c9-b578-2854674e5252-trusted-ca-bundle\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.593215 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdbd6148-8181-41c9-b578-2854674e5252-console-oauth-config\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.594662 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdbd6148-8181-41c9-b578-2854674e5252-console-serving-cert\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.607202 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crprm\" (UniqueName: \"kubernetes.io/projected/cdbd6148-8181-41c9-b578-2854674e5252-kube-api-access-crprm\") pod \"console-7784897869-654j4\" (UID: \"cdbd6148-8181-41c9-b578-2854674e5252\") " pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.614854 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk"] Oct 06 13:15:27 crc kubenswrapper[4867]: W1006 13:15:27.615837 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa0bd3d7_281e_4e5d_adef_b6c38e2d6d83.slice/crio-1f58a6d141d217f545bf379230c7ee6481a13109013a488ca85dd5da6dbb288a WatchSource:0}: Error finding container 1f58a6d141d217f545bf379230c7ee6481a13109013a488ca85dd5da6dbb288a: Status 404 
returned error can't find the container with id 1f58a6d141d217f545bf379230c7ee6481a13109013a488ca85dd5da6dbb288a Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.682199 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wvv72" event={"ID":"e19fdddd-1727-4c4b-985f-7548c278b0ca","Type":"ContainerStarted","Data":"1f835c2b82028c532145081b21414607bed77d80b7fc6905d9a83eee1e4ef4a2"} Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.683439 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl" event={"ID":"99a4464a-a11f-4a4e-86ae-43a9a76b060a","Type":"ContainerStarted","Data":"ae3f6e79a824a92cd8fb2bd076dcb432d5bc9549eb374a0c2b81d11f7d87eab3"} Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.685814 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" event={"ID":"fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83","Type":"ContainerStarted","Data":"1f58a6d141d217f545bf379230c7ee6481a13109013a488ca85dd5da6dbb288a"} Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.687755 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.690178 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5802445f-947f-4d52-b1f3-91f404ef0088-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-fm444\" (UID: \"5802445f-947f-4d52-b1f3-91f404ef0088\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.694456 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5802445f-947f-4d52-b1f3-91f404ef0088-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-fm444\" (UID: \"5802445f-947f-4d52-b1f3-91f404ef0088\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.859065 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7784897869-654j4"] Oct 06 13:15:27 crc kubenswrapper[4867]: W1006 13:15:27.867214 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdbd6148_8181_41c9_b578_2854674e5252.slice/crio-0fb2e91e36e0663214d6eccee70ed965c98d5d4c2bd6f6ee151d4756f1f016e4 WatchSource:0}: Error finding container 0fb2e91e36e0663214d6eccee70ed965c98d5d4c2bd6f6ee151d4756f1f016e4: Status 404 returned error can't find the container with id 0fb2e91e36e0663214d6eccee70ed965c98d5d4c2bd6f6ee151d4756f1f016e4 Oct 06 13:15:27 crc kubenswrapper[4867]: I1006 13:15:27.882587 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" Oct 06 13:15:28 crc kubenswrapper[4867]: I1006 13:15:28.062600 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-fm444"] Oct 06 13:15:28 crc kubenswrapper[4867]: I1006 13:15:28.694436 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7784897869-654j4" event={"ID":"cdbd6148-8181-41c9-b578-2854674e5252","Type":"ContainerStarted","Data":"ebeb19e48a29d46aca9482a5cff3167a9ac3d2a36440b8263eb6c6105b7029fa"} Oct 06 13:15:28 crc kubenswrapper[4867]: I1006 13:15:28.694770 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7784897869-654j4" event={"ID":"cdbd6148-8181-41c9-b578-2854674e5252","Type":"ContainerStarted","Data":"0fb2e91e36e0663214d6eccee70ed965c98d5d4c2bd6f6ee151d4756f1f016e4"} Oct 06 13:15:28 crc kubenswrapper[4867]: I1006 13:15:28.696954 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" event={"ID":"5802445f-947f-4d52-b1f3-91f404ef0088","Type":"ContainerStarted","Data":"d2f72c7a3d48b1612b667096c7157ddb731e8d693f8f7ec6b0c698a7bc47fb7c"} Oct 06 13:15:28 crc kubenswrapper[4867]: I1006 13:15:28.719201 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7784897869-654j4" podStartSLOduration=1.719167457 podStartE2EDuration="1.719167457s" podCreationTimestamp="2025-10-06 13:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:15:28.712549217 +0000 UTC m=+708.170497381" watchObservedRunningTime="2025-10-06 13:15:28.719167457 +0000 UTC m=+708.177115591" Oct 06 13:15:31 crc kubenswrapper[4867]: I1006 13:15:31.720395 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" 
event={"ID":"5802445f-947f-4d52-b1f3-91f404ef0088","Type":"ContainerStarted","Data":"351fe6d1d32905de83d416928f77af7e38bcf83bc97c5129e8e95ad1283993f7"} Oct 06 13:15:31 crc kubenswrapper[4867]: I1006 13:15:31.721056 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" Oct 06 13:15:31 crc kubenswrapper[4867]: I1006 13:15:31.723416 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl" event={"ID":"99a4464a-a11f-4a4e-86ae-43a9a76b060a","Type":"ContainerStarted","Data":"663bf797a70395dc13cf16b566c4cfea871d7a8a9c704502be301d077f9632ff"} Oct 06 13:15:31 crc kubenswrapper[4867]: I1006 13:15:31.725662 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" event={"ID":"fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83","Type":"ContainerStarted","Data":"f88195c067f03d2794d3b14cbd9757904bacaafdd7b94f9b6972d4e665d8a455"} Oct 06 13:15:31 crc kubenswrapper[4867]: I1006 13:15:31.727298 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wvv72" event={"ID":"e19fdddd-1727-4c4b-985f-7548c278b0ca","Type":"ContainerStarted","Data":"2a48cd9f0a0a7f13db0863e138dd37bfea81ab8b9462c8ed685039506e0e9efa"} Oct 06 13:15:31 crc kubenswrapper[4867]: I1006 13:15:31.727474 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:31 crc kubenswrapper[4867]: I1006 13:15:31.744587 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" podStartSLOduration=3.119104547 podStartE2EDuration="5.744560745s" podCreationTimestamp="2025-10-06 13:15:26 +0000 UTC" firstStartedPulling="2025-10-06 13:15:28.073905578 +0000 UTC m=+707.531853722" lastFinishedPulling="2025-10-06 13:15:30.699361776 +0000 UTC m=+710.157309920" observedRunningTime="2025-10-06 
13:15:31.741205574 +0000 UTC m=+711.199153728" watchObservedRunningTime="2025-10-06 13:15:31.744560745 +0000 UTC m=+711.202508889" Oct 06 13:15:31 crc kubenswrapper[4867]: I1006 13:15:31.762239 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-wvv72" podStartSLOduration=2.435699959 podStartE2EDuration="5.762213396s" podCreationTimestamp="2025-10-06 13:15:26 +0000 UTC" firstStartedPulling="2025-10-06 13:15:27.360495992 +0000 UTC m=+706.818444126" lastFinishedPulling="2025-10-06 13:15:30.687009429 +0000 UTC m=+710.144957563" observedRunningTime="2025-10-06 13:15:31.761054344 +0000 UTC m=+711.219002508" watchObservedRunningTime="2025-10-06 13:15:31.762213396 +0000 UTC m=+711.220161540" Oct 06 13:15:31 crc kubenswrapper[4867]: I1006 13:15:31.780024 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-trjgk" podStartSLOduration=1.711419767 podStartE2EDuration="4.77999944s" podCreationTimestamp="2025-10-06 13:15:27 +0000 UTC" firstStartedPulling="2025-10-06 13:15:27.618396415 +0000 UTC m=+707.076344559" lastFinishedPulling="2025-10-06 13:15:30.686976088 +0000 UTC m=+710.144924232" observedRunningTime="2025-10-06 13:15:31.778920161 +0000 UTC m=+711.236868305" watchObservedRunningTime="2025-10-06 13:15:31.77999944 +0000 UTC m=+711.237947584" Oct 06 13:15:33 crc kubenswrapper[4867]: I1006 13:15:33.742814 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl" event={"ID":"99a4464a-a11f-4a4e-86ae-43a9a76b060a","Type":"ContainerStarted","Data":"dfc38144ec1a227e470544f1b6cda38185d811b01ccdc3fd16ea216f6d12a061"} Oct 06 13:15:33 crc kubenswrapper[4867]: I1006 13:15:33.762607 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-vdksl" podStartSLOduration=1.904388362 podStartE2EDuration="7.762580472s" 
podCreationTimestamp="2025-10-06 13:15:26 +0000 UTC" firstStartedPulling="2025-10-06 13:15:27.528647931 +0000 UTC m=+706.986596075" lastFinishedPulling="2025-10-06 13:15:33.386840041 +0000 UTC m=+712.844788185" observedRunningTime="2025-10-06 13:15:33.758327096 +0000 UTC m=+713.216275240" watchObservedRunningTime="2025-10-06 13:15:33.762580472 +0000 UTC m=+713.220528626" Oct 06 13:15:37 crc kubenswrapper[4867]: I1006 13:15:37.345564 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-wvv72" Oct 06 13:15:37 crc kubenswrapper[4867]: I1006 13:15:37.688354 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:37 crc kubenswrapper[4867]: I1006 13:15:37.688755 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:37 crc kubenswrapper[4867]: I1006 13:15:37.697058 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:37 crc kubenswrapper[4867]: I1006 13:15:37.775831 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7784897869-654j4" Oct 06 13:15:37 crc kubenswrapper[4867]: I1006 13:15:37.849458 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rqnc4"] Oct 06 13:15:47 crc kubenswrapper[4867]: I1006 13:15:47.888697 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-fm444" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.572530 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l"] Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.574310 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.576531 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.587366 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l"] Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.676678 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.676768 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.676803 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg6cc\" (UniqueName: \"kubernetes.io/projected/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-kube-api-access-xg6cc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:02 crc kubenswrapper[4867]: 
I1006 13:16:02.778240 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.778325 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg6cc\" (UniqueName: \"kubernetes.io/projected/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-kube-api-access-xg6cc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.778466 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.778985 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.779002 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.798664 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg6cc\" (UniqueName: \"kubernetes.io/projected/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-kube-api-access-xg6cc\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.891009 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:02 crc kubenswrapper[4867]: I1006 13:16:02.898319 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rqnc4" podUID="a85dd45a-f972-4cb7-aa77-e2f8468df1cf" containerName="console" containerID="cri-o://1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923" gracePeriod=15 Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.265596 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rqnc4_a85dd45a-f972-4cb7-aa77-e2f8468df1cf/console/0.log" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.266127 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.320416 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l"] Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.387461 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-oauth-serving-cert\") pod \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.387889 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-service-ca\") pod \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.387913 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-oauth-config\") pod \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.387942 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw52g\" (UniqueName: \"kubernetes.io/projected/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-kube-api-access-dw52g\") pod \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.387971 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-config\") pod 
\"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.388051 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-trusted-ca-bundle\") pod \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.388081 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-serving-cert\") pod \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\" (UID: \"a85dd45a-f972-4cb7-aa77-e2f8468df1cf\") " Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.388489 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a85dd45a-f972-4cb7-aa77-e2f8468df1cf" (UID: "a85dd45a-f972-4cb7-aa77-e2f8468df1cf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.388567 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-config" (OuterVolumeSpecName: "console-config") pod "a85dd45a-f972-4cb7-aa77-e2f8468df1cf" (UID: "a85dd45a-f972-4cb7-aa77-e2f8468df1cf"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.388834 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a85dd45a-f972-4cb7-aa77-e2f8468df1cf" (UID: "a85dd45a-f972-4cb7-aa77-e2f8468df1cf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.389084 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-service-ca" (OuterVolumeSpecName: "service-ca") pod "a85dd45a-f972-4cb7-aa77-e2f8468df1cf" (UID: "a85dd45a-f972-4cb7-aa77-e2f8468df1cf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.393716 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-kube-api-access-dw52g" (OuterVolumeSpecName: "kube-api-access-dw52g") pod "a85dd45a-f972-4cb7-aa77-e2f8468df1cf" (UID: "a85dd45a-f972-4cb7-aa77-e2f8468df1cf"). InnerVolumeSpecName "kube-api-access-dw52g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.393840 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a85dd45a-f972-4cb7-aa77-e2f8468df1cf" (UID: "a85dd45a-f972-4cb7-aa77-e2f8468df1cf"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.393949 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a85dd45a-f972-4cb7-aa77-e2f8468df1cf" (UID: "a85dd45a-f972-4cb7-aa77-e2f8468df1cf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.490149 4867 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.490193 4867 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.490205 4867 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.490216 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw52g\" (UniqueName: \"kubernetes.io/projected/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-kube-api-access-dw52g\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.490226 4867 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.490236 4867 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.490245 4867 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a85dd45a-f972-4cb7-aa77-e2f8468df1cf-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.955016 4867 generic.go:334] "Generic (PLEG): container finished" podID="4ab89d13-c239-4d47-aa11-68d1ea20e6b1" containerID="bad2ec9124a4c8e05bc7895ac7b4ed70962ccd2658d90adb590ada323ee9b34e" exitCode=0 Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.955104 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" event={"ID":"4ab89d13-c239-4d47-aa11-68d1ea20e6b1","Type":"ContainerDied","Data":"bad2ec9124a4c8e05bc7895ac7b4ed70962ccd2658d90adb590ada323ee9b34e"} Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.955134 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" event={"ID":"4ab89d13-c239-4d47-aa11-68d1ea20e6b1","Type":"ContainerStarted","Data":"a192d5a5b969d6795e8bb7b8aaf8fb79af5546baac4692ae60abd20b0dd8ba45"} Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.957132 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rqnc4_a85dd45a-f972-4cb7-aa77-e2f8468df1cf/console/0.log" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.957159 4867 generic.go:334] "Generic (PLEG): container finished" podID="a85dd45a-f972-4cb7-aa77-e2f8468df1cf" containerID="1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923" exitCode=2 Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.957177 4867 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-console/console-f9d7485db-rqnc4" event={"ID":"a85dd45a-f972-4cb7-aa77-e2f8468df1cf","Type":"ContainerDied","Data":"1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923"} Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.957223 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rqnc4" event={"ID":"a85dd45a-f972-4cb7-aa77-e2f8468df1cf","Type":"ContainerDied","Data":"a6fb20885bc0e5c7d1b5eb63ef41fb249b47eb0aa91a5afb9019f71b5a93c690"} Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.957242 4867 scope.go:117] "RemoveContainer" containerID="1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.957390 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rqnc4" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.979538 4867 scope.go:117] "RemoveContainer" containerID="1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923" Oct 06 13:16:03 crc kubenswrapper[4867]: E1006 13:16:03.980121 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923\": container with ID starting with 1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923 not found: ID does not exist" containerID="1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923" Oct 06 13:16:03 crc kubenswrapper[4867]: I1006 13:16:03.980165 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923"} err="failed to get container status \"1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923\": rpc error: code = NotFound desc = could not find container \"1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923\": 
container with ID starting with 1d355abbdeacdf024a119e2ba3c84a14285e1ff689b3146c12ffb2a0849b2923 not found: ID does not exist" Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.007306 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rqnc4"] Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.016017 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rqnc4"] Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.370997 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j646b"] Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.371691 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" podUID="674661a9-9e17-4d57-b887-8294a70fdcad" containerName="controller-manager" containerID="cri-o://d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9" gracePeriod=30 Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.473236 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"] Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.473546 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" podUID="0bd3770c-b6a9-42a3-9530-30ef3b90c7ca" containerName="route-controller-manager" containerID="cri-o://f23823ef72bc471ab9f68c0ba8f8a908d55659711ccb2cca38713ef9c8d50c99" gracePeriod=30 Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.897032 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.965277 4867 generic.go:334] "Generic (PLEG): container finished" podID="0bd3770c-b6a9-42a3-9530-30ef3b90c7ca" containerID="f23823ef72bc471ab9f68c0ba8f8a908d55659711ccb2cca38713ef9c8d50c99" exitCode=0 Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.965361 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" event={"ID":"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca","Type":"ContainerDied","Data":"f23823ef72bc471ab9f68c0ba8f8a908d55659711ccb2cca38713ef9c8d50c99"} Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.965398 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" event={"ID":"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca","Type":"ContainerDied","Data":"88b79c911c935f4f988a317f8104812bebf557fd2047b5d70345c24dba53d3c3"} Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.965410 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88b79c911c935f4f988a317f8104812bebf557fd2047b5d70345c24dba53d3c3" Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.968284 4867 generic.go:334] "Generic (PLEG): container finished" podID="674661a9-9e17-4d57-b887-8294a70fdcad" containerID="d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9" exitCode=0 Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.968320 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" event={"ID":"674661a9-9e17-4d57-b887-8294a70fdcad","Type":"ContainerDied","Data":"d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9"} Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.968338 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" event={"ID":"674661a9-9e17-4d57-b887-8294a70fdcad","Type":"ContainerDied","Data":"d48ba02f827af1b199fe5e223c30ebdefedb3e99e196f5e645bfd10711d33fbc"} Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.968355 4867 scope.go:117] "RemoveContainer" containerID="d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9" Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.968429 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j646b" Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.990147 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.996304 4867 scope.go:117] "RemoveContainer" containerID="d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9" Oct 06 13:16:04 crc kubenswrapper[4867]: E1006 13:16:04.996688 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9\": container with ID starting with d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9 not found: ID does not exist" containerID="d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9" Oct 06 13:16:04 crc kubenswrapper[4867]: I1006 13:16:04.996741 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9"} err="failed to get container status \"d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9\": rpc error: code = NotFound desc = could not find container \"d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9\": container with ID starting with 
d0b04e645424aee6037bf0a223bd74d3a3da7aa653d581974cebb1a82097b4e9 not found: ID does not exist" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.025690 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-config\") pod \"674661a9-9e17-4d57-b887-8294a70fdcad\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.025760 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-proxy-ca-bundles\") pod \"674661a9-9e17-4d57-b887-8294a70fdcad\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.025843 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/674661a9-9e17-4d57-b887-8294a70fdcad-serving-cert\") pod \"674661a9-9e17-4d57-b887-8294a70fdcad\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.025888 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbc6c\" (UniqueName: \"kubernetes.io/projected/674661a9-9e17-4d57-b887-8294a70fdcad-kube-api-access-nbc6c\") pod \"674661a9-9e17-4d57-b887-8294a70fdcad\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.025993 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-client-ca\") pod \"674661a9-9e17-4d57-b887-8294a70fdcad\" (UID: \"674661a9-9e17-4d57-b887-8294a70fdcad\") " Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.026693 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "674661a9-9e17-4d57-b887-8294a70fdcad" (UID: "674661a9-9e17-4d57-b887-8294a70fdcad"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.026907 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-config" (OuterVolumeSpecName: "config") pod "674661a9-9e17-4d57-b887-8294a70fdcad" (UID: "674661a9-9e17-4d57-b887-8294a70fdcad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.027746 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-client-ca" (OuterVolumeSpecName: "client-ca") pod "674661a9-9e17-4d57-b887-8294a70fdcad" (UID: "674661a9-9e17-4d57-b887-8294a70fdcad"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.034024 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/674661a9-9e17-4d57-b887-8294a70fdcad-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "674661a9-9e17-4d57-b887-8294a70fdcad" (UID: "674661a9-9e17-4d57-b887-8294a70fdcad"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.036795 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674661a9-9e17-4d57-b887-8294a70fdcad-kube-api-access-nbc6c" (OuterVolumeSpecName: "kube-api-access-nbc6c") pod "674661a9-9e17-4d57-b887-8294a70fdcad" (UID: "674661a9-9e17-4d57-b887-8294a70fdcad"). InnerVolumeSpecName "kube-api-access-nbc6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.127832 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-client-ca\") pod \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.127959 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l85p\" (UniqueName: \"kubernetes.io/projected/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-kube-api-access-2l85p\") pod \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.127997 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-config\") pod \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.128025 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-serving-cert\") pod \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\" (UID: \"0bd3770c-b6a9-42a3-9530-30ef3b90c7ca\") " Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.128351 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.128366 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:05 crc 
kubenswrapper[4867]: I1006 13:16:05.128374 4867 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/674661a9-9e17-4d57-b887-8294a70fdcad-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.128385 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/674661a9-9e17-4d57-b887-8294a70fdcad-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.128394 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbc6c\" (UniqueName: \"kubernetes.io/projected/674661a9-9e17-4d57-b887-8294a70fdcad-kube-api-access-nbc6c\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.129822 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-client-ca" (OuterVolumeSpecName: "client-ca") pod "0bd3770c-b6a9-42a3-9530-30ef3b90c7ca" (UID: "0bd3770c-b6a9-42a3-9530-30ef3b90c7ca"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.130043 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-config" (OuterVolumeSpecName: "config") pod "0bd3770c-b6a9-42a3-9530-30ef3b90c7ca" (UID: "0bd3770c-b6a9-42a3-9530-30ef3b90c7ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.132209 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0bd3770c-b6a9-42a3-9530-30ef3b90c7ca" (UID: "0bd3770c-b6a9-42a3-9530-30ef3b90c7ca"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.132945 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-kube-api-access-2l85p" (OuterVolumeSpecName: "kube-api-access-2l85p") pod "0bd3770c-b6a9-42a3-9530-30ef3b90c7ca" (UID: "0bd3770c-b6a9-42a3-9530-30ef3b90c7ca"). InnerVolumeSpecName "kube-api-access-2l85p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.229859 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a85dd45a-f972-4cb7-aa77-e2f8468df1cf" path="/var/lib/kubelet/pods/a85dd45a-f972-4cb7-aa77-e2f8468df1cf/volumes" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.229947 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l85p\" (UniqueName: \"kubernetes.io/projected/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-kube-api-access-2l85p\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.230373 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.230421 4867 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.230451 4867 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.295485 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-j646b"] Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.300419 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j646b"] Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.981837 4867 generic.go:334] "Generic (PLEG): container finished" podID="4ab89d13-c239-4d47-aa11-68d1ea20e6b1" containerID="14650af89dbbbe8b83ba1641134973fa4f59fe439f327bd318ffdb2587cc4e6b" exitCode=0 Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.981960 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" event={"ID":"4ab89d13-c239-4d47-aa11-68d1ea20e6b1","Type":"ContainerDied","Data":"14650af89dbbbe8b83ba1641134973fa4f59fe439f327bd318ffdb2587cc4e6b"} Oct 06 13:16:05 crc kubenswrapper[4867]: I1006 13:16:05.982411 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.017334 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz"] Oct 06 13:16:06 crc kubenswrapper[4867]: E1006 13:16:06.017667 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bd3770c-b6a9-42a3-9530-30ef3b90c7ca" containerName="route-controller-manager" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.017681 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd3770c-b6a9-42a3-9530-30ef3b90c7ca" containerName="route-controller-manager" Oct 06 13:16:06 crc kubenswrapper[4867]: E1006 13:16:06.017690 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674661a9-9e17-4d57-b887-8294a70fdcad" containerName="controller-manager" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.017697 4867 
state_mem.go:107] "Deleted CPUSet assignment" podUID="674661a9-9e17-4d57-b887-8294a70fdcad" containerName="controller-manager" Oct 06 13:16:06 crc kubenswrapper[4867]: E1006 13:16:06.017708 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85dd45a-f972-4cb7-aa77-e2f8468df1cf" containerName="console" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.017714 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85dd45a-f972-4cb7-aa77-e2f8468df1cf" containerName="console" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.017834 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bd3770c-b6a9-42a3-9530-30ef3b90c7ca" containerName="route-controller-manager" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.017845 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85dd45a-f972-4cb7-aa77-e2f8468df1cf" containerName="console" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.017853 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="674661a9-9e17-4d57-b887-8294a70fdcad" containerName="controller-manager" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.018358 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.022712 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.022926 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.023046 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.023157 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.023262 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.023452 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.028317 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4"] Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.029440 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.031839 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.033301 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"] Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.034109 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.037541 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.038106 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m6n6l"] Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.042413 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.043546 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.043997 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.045211 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4"] Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.049319 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 06 
13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.060172 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz"] Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.142895 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c82feb-58d6-41b0-b9b3-122e607cc022-serving-cert\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.143050 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggjtt\" (UniqueName: \"kubernetes.io/projected/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-kube-api-access-ggjtt\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.143117 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-proxy-ca-bundles\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.143172 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c82feb-58d6-41b0-b9b3-122e607cc022-config\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 
13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.143205 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-client-ca\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.143357 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-config\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.143867 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21c82feb-58d6-41b0-b9b3-122e607cc022-client-ca\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.146036 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zxd2\" (UniqueName: \"kubernetes.io/projected/21c82feb-58d6-41b0-b9b3-122e607cc022-kube-api-access-7zxd2\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.146093 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-serving-cert\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.247324 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggjtt\" (UniqueName: \"kubernetes.io/projected/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-kube-api-access-ggjtt\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.247381 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-proxy-ca-bundles\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.247403 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c82feb-58d6-41b0-b9b3-122e607cc022-config\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.247423 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-client-ca\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.247453 
4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-config\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.247468 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21c82feb-58d6-41b0-b9b3-122e607cc022-client-ca\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.247504 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zxd2\" (UniqueName: \"kubernetes.io/projected/21c82feb-58d6-41b0-b9b3-122e607cc022-kube-api-access-7zxd2\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.247520 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-serving-cert\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.247554 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c82feb-58d6-41b0-b9b3-122e607cc022-serving-cert\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " 
pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.249317 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-client-ca\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.249548 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c82feb-58d6-41b0-b9b3-122e607cc022-config\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.249619 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21c82feb-58d6-41b0-b9b3-122e607cc022-client-ca\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.249758 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-proxy-ca-bundles\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.250095 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-config\") pod 
\"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.256686 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c82feb-58d6-41b0-b9b3-122e607cc022-serving-cert\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.270373 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zxd2\" (UniqueName: \"kubernetes.io/projected/21c82feb-58d6-41b0-b9b3-122e607cc022-kube-api-access-7zxd2\") pod \"route-controller-manager-f4f549b66-c4dfz\" (UID: \"21c82feb-58d6-41b0-b9b3-122e607cc022\") " pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.271747 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggjtt\" (UniqueName: \"kubernetes.io/projected/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-kube-api-access-ggjtt\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.271865 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f507cfd1-7d3c-4427-95b7-0ea58d3495fb-serving-cert\") pod \"controller-manager-5d8ff6b9b5-8cdp4\" (UID: \"f507cfd1-7d3c-4427-95b7-0ea58d3495fb\") " pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.383716 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.426114 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.686937 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4"] Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.843669 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz"] Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.988023 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" event={"ID":"21c82feb-58d6-41b0-b9b3-122e607cc022","Type":"ContainerStarted","Data":"d5e2925415467fd843cf8e6bf07c01a3c8884837ead7cf9fb9e5621f768a3a04"} Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.988170 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" event={"ID":"21c82feb-58d6-41b0-b9b3-122e607cc022","Type":"ContainerStarted","Data":"fd567f88b9feb8f78f3250b730502470ecfd772ba43a2678789478a26d755e41"} Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.989449 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.991038 4867 patch_prober.go:28] interesting pod/route-controller-manager-f4f549b66-c4dfz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" 
start-of-body= Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.991082 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" podUID="21c82feb-58d6-41b0-b9b3-122e607cc022" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.991899 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" event={"ID":"f507cfd1-7d3c-4427-95b7-0ea58d3495fb","Type":"ContainerStarted","Data":"ce65b2adb5c1424e18c4e85922aa15ecd001b96975c6960f04d59c276ee3a3bc"} Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.991937 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" event={"ID":"f507cfd1-7d3c-4427-95b7-0ea58d3495fb","Type":"ContainerStarted","Data":"861c7b0e08d72d1e2808e1c6f3b3d45a4f318926ab3c255d701693800e759c6f"} Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.992715 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.995379 4867 generic.go:334] "Generic (PLEG): container finished" podID="4ab89d13-c239-4d47-aa11-68d1ea20e6b1" containerID="d5f269f6d7bce46e207031fbef92a581b08f93f81c6925deea13c5aeaa02d5e7" exitCode=0 Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.995428 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" event={"ID":"4ab89d13-c239-4d47-aa11-68d1ea20e6b1","Type":"ContainerDied","Data":"d5f269f6d7bce46e207031fbef92a581b08f93f81c6925deea13c5aeaa02d5e7"} Oct 06 13:16:06 crc kubenswrapper[4867]: I1006 13:16:06.997629 4867 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" Oct 06 13:16:07 crc kubenswrapper[4867]: I1006 13:16:07.011129 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" podStartSLOduration=3.011108063 podStartE2EDuration="3.011108063s" podCreationTimestamp="2025-10-06 13:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:16:07.004697238 +0000 UTC m=+746.462645382" watchObservedRunningTime="2025-10-06 13:16:07.011108063 +0000 UTC m=+746.469056197" Oct 06 13:16:07 crc kubenswrapper[4867]: I1006 13:16:07.031245 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d8ff6b9b5-8cdp4" podStartSLOduration=3.031222881 podStartE2EDuration="3.031222881s" podCreationTimestamp="2025-10-06 13:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:16:07.027278313 +0000 UTC m=+746.485226457" watchObservedRunningTime="2025-10-06 13:16:07.031222881 +0000 UTC m=+746.489171025" Oct 06 13:16:07 crc kubenswrapper[4867]: I1006 13:16:07.229263 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bd3770c-b6a9-42a3-9530-30ef3b90c7ca" path="/var/lib/kubelet/pods/0bd3770c-b6a9-42a3-9530-30ef3b90c7ca/volumes" Oct 06 13:16:07 crc kubenswrapper[4867]: I1006 13:16:07.229821 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="674661a9-9e17-4d57-b887-8294a70fdcad" path="/var/lib/kubelet/pods/674661a9-9e17-4d57-b887-8294a70fdcad/volumes" Oct 06 13:16:08 crc kubenswrapper[4867]: I1006 13:16:08.005293 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-f4f549b66-c4dfz" Oct 06 13:16:08 crc kubenswrapper[4867]: I1006 13:16:08.245296 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:08 crc kubenswrapper[4867]: I1006 13:16:08.279344 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-util\") pod \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " Oct 06 13:16:08 crc kubenswrapper[4867]: I1006 13:16:08.279433 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-bundle\") pod \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " Oct 06 13:16:08 crc kubenswrapper[4867]: I1006 13:16:08.279633 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg6cc\" (UniqueName: \"kubernetes.io/projected/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-kube-api-access-xg6cc\") pod \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\" (UID: \"4ab89d13-c239-4d47-aa11-68d1ea20e6b1\") " Oct 06 13:16:08 crc kubenswrapper[4867]: I1006 13:16:08.280626 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-bundle" (OuterVolumeSpecName: "bundle") pod "4ab89d13-c239-4d47-aa11-68d1ea20e6b1" (UID: "4ab89d13-c239-4d47-aa11-68d1ea20e6b1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:16:08 crc kubenswrapper[4867]: I1006 13:16:08.289400 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-kube-api-access-xg6cc" (OuterVolumeSpecName: "kube-api-access-xg6cc") pod "4ab89d13-c239-4d47-aa11-68d1ea20e6b1" (UID: "4ab89d13-c239-4d47-aa11-68d1ea20e6b1"). InnerVolumeSpecName "kube-api-access-xg6cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:16:08 crc kubenswrapper[4867]: I1006 13:16:08.294667 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-util" (OuterVolumeSpecName: "util") pod "4ab89d13-c239-4d47-aa11-68d1ea20e6b1" (UID: "4ab89d13-c239-4d47-aa11-68d1ea20e6b1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:16:08 crc kubenswrapper[4867]: I1006 13:16:08.381957 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg6cc\" (UniqueName: \"kubernetes.io/projected/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-kube-api-access-xg6cc\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:08 crc kubenswrapper[4867]: I1006 13:16:08.381991 4867 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-util\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:08 crc kubenswrapper[4867]: I1006 13:16:08.382000 4867 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ab89d13-c239-4d47-aa11-68d1ea20e6b1-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:09 crc kubenswrapper[4867]: I1006 13:16:09.020661 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" 
event={"ID":"4ab89d13-c239-4d47-aa11-68d1ea20e6b1","Type":"ContainerDied","Data":"a192d5a5b969d6795e8bb7b8aaf8fb79af5546baac4692ae60abd20b0dd8ba45"} Oct 06 13:16:09 crc kubenswrapper[4867]: I1006 13:16:09.020953 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a192d5a5b969d6795e8bb7b8aaf8fb79af5546baac4692ae60abd20b0dd8ba45" Oct 06 13:16:09 crc kubenswrapper[4867]: I1006 13:16:09.020810 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l" Oct 06 13:16:10 crc kubenswrapper[4867]: E1006 13:16:10.026840 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/NetworkManager-dispatcher.service\": RecentStats: unable to find data in memory cache]" Oct 06 13:16:13 crc kubenswrapper[4867]: I1006 13:16:13.570106 4867 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.032618 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5487d99769-x5czz"] Oct 06 13:16:18 crc kubenswrapper[4867]: E1006 13:16:18.033475 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab89d13-c239-4d47-aa11-68d1ea20e6b1" containerName="extract" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.033489 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab89d13-c239-4d47-aa11-68d1ea20e6b1" containerName="extract" Oct 06 13:16:18 crc kubenswrapper[4867]: E1006 13:16:18.033497 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab89d13-c239-4d47-aa11-68d1ea20e6b1" containerName="util" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.033503 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4ab89d13-c239-4d47-aa11-68d1ea20e6b1" containerName="util" Oct 06 13:16:18 crc kubenswrapper[4867]: E1006 13:16:18.033514 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab89d13-c239-4d47-aa11-68d1ea20e6b1" containerName="pull" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.033521 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab89d13-c239-4d47-aa11-68d1ea20e6b1" containerName="pull" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.033648 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab89d13-c239-4d47-aa11-68d1ea20e6b1" containerName="extract" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.034156 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.038314 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.038330 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.038450 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.039196 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.039398 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dhd4z" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.047607 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5487d99769-x5czz"] Oct 06 13:16:18 crc kubenswrapper[4867]: 
I1006 13:16:18.111559 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43844a7c-24fd-49b1-9860-6b4a63fc136a-apiservice-cert\") pod \"metallb-operator-controller-manager-5487d99769-x5czz\" (UID: \"43844a7c-24fd-49b1-9860-6b4a63fc136a\") " pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.111630 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43844a7c-24fd-49b1-9860-6b4a63fc136a-webhook-cert\") pod \"metallb-operator-controller-manager-5487d99769-x5czz\" (UID: \"43844a7c-24fd-49b1-9860-6b4a63fc136a\") " pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.111709 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxr8\" (UniqueName: \"kubernetes.io/projected/43844a7c-24fd-49b1-9860-6b4a63fc136a-kube-api-access-9hxr8\") pod \"metallb-operator-controller-manager-5487d99769-x5czz\" (UID: \"43844a7c-24fd-49b1-9860-6b4a63fc136a\") " pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.213976 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43844a7c-24fd-49b1-9860-6b4a63fc136a-apiservice-cert\") pod \"metallb-operator-controller-manager-5487d99769-x5czz\" (UID: \"43844a7c-24fd-49b1-9860-6b4a63fc136a\") " pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.214025 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/43844a7c-24fd-49b1-9860-6b4a63fc136a-webhook-cert\") pod \"metallb-operator-controller-manager-5487d99769-x5czz\" (UID: \"43844a7c-24fd-49b1-9860-6b4a63fc136a\") " pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.214163 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hxr8\" (UniqueName: \"kubernetes.io/projected/43844a7c-24fd-49b1-9860-6b4a63fc136a-kube-api-access-9hxr8\") pod \"metallb-operator-controller-manager-5487d99769-x5czz\" (UID: \"43844a7c-24fd-49b1-9860-6b4a63fc136a\") " pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.221374 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43844a7c-24fd-49b1-9860-6b4a63fc136a-webhook-cert\") pod \"metallb-operator-controller-manager-5487d99769-x5czz\" (UID: \"43844a7c-24fd-49b1-9860-6b4a63fc136a\") " pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.231220 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43844a7c-24fd-49b1-9860-6b4a63fc136a-apiservice-cert\") pod \"metallb-operator-controller-manager-5487d99769-x5czz\" (UID: \"43844a7c-24fd-49b1-9860-6b4a63fc136a\") " pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.233872 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hxr8\" (UniqueName: \"kubernetes.io/projected/43844a7c-24fd-49b1-9860-6b4a63fc136a-kube-api-access-9hxr8\") pod \"metallb-operator-controller-manager-5487d99769-x5czz\" (UID: \"43844a7c-24fd-49b1-9860-6b4a63fc136a\") " 
pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.352158 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.499530 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln"] Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.505517 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.517113 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-zsrpq" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.522693 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.522873 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln"] Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.523088 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.523086 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb30a785-833d-47ee-be7f-5235fbfc826c-webhook-cert\") pod \"metallb-operator-webhook-server-9d5469fbf-r6fln\" (UID: \"fb30a785-833d-47ee-be7f-5235fbfc826c\") " pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.523149 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-dh6bw\" (UniqueName: \"kubernetes.io/projected/fb30a785-833d-47ee-be7f-5235fbfc826c-kube-api-access-dh6bw\") pod \"metallb-operator-webhook-server-9d5469fbf-r6fln\" (UID: \"fb30a785-833d-47ee-be7f-5235fbfc826c\") " pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.523170 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb30a785-833d-47ee-be7f-5235fbfc826c-apiservice-cert\") pod \"metallb-operator-webhook-server-9d5469fbf-r6fln\" (UID: \"fb30a785-833d-47ee-be7f-5235fbfc826c\") " pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.624510 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb30a785-833d-47ee-be7f-5235fbfc826c-webhook-cert\") pod \"metallb-operator-webhook-server-9d5469fbf-r6fln\" (UID: \"fb30a785-833d-47ee-be7f-5235fbfc826c\") " pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.624571 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh6bw\" (UniqueName: \"kubernetes.io/projected/fb30a785-833d-47ee-be7f-5235fbfc826c-kube-api-access-dh6bw\") pod \"metallb-operator-webhook-server-9d5469fbf-r6fln\" (UID: \"fb30a785-833d-47ee-be7f-5235fbfc826c\") " pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.624601 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb30a785-833d-47ee-be7f-5235fbfc826c-apiservice-cert\") pod \"metallb-operator-webhook-server-9d5469fbf-r6fln\" (UID: 
\"fb30a785-833d-47ee-be7f-5235fbfc826c\") " pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.629833 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb30a785-833d-47ee-be7f-5235fbfc826c-webhook-cert\") pod \"metallb-operator-webhook-server-9d5469fbf-r6fln\" (UID: \"fb30a785-833d-47ee-be7f-5235fbfc826c\") " pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.629883 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb30a785-833d-47ee-be7f-5235fbfc826c-apiservice-cert\") pod \"metallb-operator-webhook-server-9d5469fbf-r6fln\" (UID: \"fb30a785-833d-47ee-be7f-5235fbfc826c\") " pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.655891 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh6bw\" (UniqueName: \"kubernetes.io/projected/fb30a785-833d-47ee-be7f-5235fbfc826c-kube-api-access-dh6bw\") pod \"metallb-operator-webhook-server-9d5469fbf-r6fln\" (UID: \"fb30a785-833d-47ee-be7f-5235fbfc826c\") " pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.816071 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5487d99769-x5czz"] Oct 06 13:16:18 crc kubenswrapper[4867]: I1006 13:16:18.834091 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:19 crc kubenswrapper[4867]: I1006 13:16:19.105960 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" event={"ID":"43844a7c-24fd-49b1-9860-6b4a63fc136a","Type":"ContainerStarted","Data":"e50a3eb122434f2cf0bceca4b5acfa47a71f6650c172ee3194b2065353c2c394"} Oct 06 13:16:19 crc kubenswrapper[4867]: I1006 13:16:19.347731 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln"] Oct 06 13:16:19 crc kubenswrapper[4867]: I1006 13:16:19.745652 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xhjkj"] Oct 06 13:16:19 crc kubenswrapper[4867]: I1006 13:16:19.747710 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:19 crc kubenswrapper[4867]: I1006 13:16:19.760210 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xhjkj"] Oct 06 13:16:19 crc kubenswrapper[4867]: I1006 13:16:19.945207 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-utilities\") pod \"certified-operators-xhjkj\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:19 crc kubenswrapper[4867]: I1006 13:16:19.945414 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-catalog-content\") pod \"certified-operators-xhjkj\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:19 
crc kubenswrapper[4867]: I1006 13:16:19.945445 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwnln\" (UniqueName: \"kubernetes.io/projected/1de8bdab-8be8-4faa-8423-da5d92ca8d51-kube-api-access-jwnln\") pod \"certified-operators-xhjkj\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:20 crc kubenswrapper[4867]: I1006 13:16:20.047027 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-utilities\") pod \"certified-operators-xhjkj\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:20 crc kubenswrapper[4867]: I1006 13:16:20.047130 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-catalog-content\") pod \"certified-operators-xhjkj\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:20 crc kubenswrapper[4867]: I1006 13:16:20.047172 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwnln\" (UniqueName: \"kubernetes.io/projected/1de8bdab-8be8-4faa-8423-da5d92ca8d51-kube-api-access-jwnln\") pod \"certified-operators-xhjkj\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:20 crc kubenswrapper[4867]: I1006 13:16:20.047660 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-utilities\") pod \"certified-operators-xhjkj\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:20 crc 
kubenswrapper[4867]: I1006 13:16:20.047660 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-catalog-content\") pod \"certified-operators-xhjkj\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:20 crc kubenswrapper[4867]: I1006 13:16:20.102434 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwnln\" (UniqueName: \"kubernetes.io/projected/1de8bdab-8be8-4faa-8423-da5d92ca8d51-kube-api-access-jwnln\") pod \"certified-operators-xhjkj\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:20 crc kubenswrapper[4867]: I1006 13:16:20.116167 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" event={"ID":"fb30a785-833d-47ee-be7f-5235fbfc826c","Type":"ContainerStarted","Data":"dfd52283982113b38f514288ae1b9d3ab0f05267a48dd67241a45d1a413aca6c"} Oct 06 13:16:20 crc kubenswrapper[4867]: I1006 13:16:20.119722 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:20 crc kubenswrapper[4867]: I1006 13:16:20.633172 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xhjkj"] Oct 06 13:16:21 crc kubenswrapper[4867]: I1006 13:16:21.134815 4867 generic.go:334] "Generic (PLEG): container finished" podID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" containerID="0f5573f970e5883de001b36603b88c68514b03c99912a13590004a048806e9f8" exitCode=0 Oct 06 13:16:21 crc kubenswrapper[4867]: I1006 13:16:21.135120 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhjkj" event={"ID":"1de8bdab-8be8-4faa-8423-da5d92ca8d51","Type":"ContainerDied","Data":"0f5573f970e5883de001b36603b88c68514b03c99912a13590004a048806e9f8"} Oct 06 13:16:21 crc kubenswrapper[4867]: I1006 13:16:21.135153 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhjkj" event={"ID":"1de8bdab-8be8-4faa-8423-da5d92ca8d51","Type":"ContainerStarted","Data":"83c53a4bc4f5167c9479b9a47dbe1b9b3304aff70e1e272193776fa161e56837"} Oct 06 13:16:25 crc kubenswrapper[4867]: I1006 13:16:25.174931 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" event={"ID":"fb30a785-833d-47ee-be7f-5235fbfc826c","Type":"ContainerStarted","Data":"75a855256769dc4cb946a33c957b117c7807a9352c5bbbafc6bdda58f924528d"} Oct 06 13:16:25 crc kubenswrapper[4867]: I1006 13:16:25.175656 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:25 crc kubenswrapper[4867]: I1006 13:16:25.176228 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" 
event={"ID":"43844a7c-24fd-49b1-9860-6b4a63fc136a","Type":"ContainerStarted","Data":"909a572a2be0713632df481f7610df42453559db40c59413a4d1be082bdc1fd6"} Oct 06 13:16:25 crc kubenswrapper[4867]: I1006 13:16:25.176315 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:25 crc kubenswrapper[4867]: I1006 13:16:25.177995 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhjkj" event={"ID":"1de8bdab-8be8-4faa-8423-da5d92ca8d51","Type":"ContainerStarted","Data":"5d4b168f4790e4f21e5bde0b3c11e9ebcae056df901c42d05fcbb7d8b2dd8793"} Oct 06 13:16:25 crc kubenswrapper[4867]: I1006 13:16:25.199626 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" podStartSLOduration=1.598653283 podStartE2EDuration="7.199603119s" podCreationTimestamp="2025-10-06 13:16:18 +0000 UTC" firstStartedPulling="2025-10-06 13:16:19.358340649 +0000 UTC m=+758.816288793" lastFinishedPulling="2025-10-06 13:16:24.959290495 +0000 UTC m=+764.417238629" observedRunningTime="2025-10-06 13:16:25.197586794 +0000 UTC m=+764.655534948" watchObservedRunningTime="2025-10-06 13:16:25.199603119 +0000 UTC m=+764.657551263" Oct 06 13:16:25 crc kubenswrapper[4867]: I1006 13:16:25.227807 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" podStartSLOduration=1.107476249 podStartE2EDuration="7.227786666s" podCreationTimestamp="2025-10-06 13:16:18 +0000 UTC" firstStartedPulling="2025-10-06 13:16:18.825722377 +0000 UTC m=+758.283670551" lastFinishedPulling="2025-10-06 13:16:24.946032824 +0000 UTC m=+764.403980968" observedRunningTime="2025-10-06 13:16:25.224874807 +0000 UTC m=+764.682822951" watchObservedRunningTime="2025-10-06 13:16:25.227786666 +0000 UTC m=+764.685734820" Oct 06 13:16:26 crc 
kubenswrapper[4867]: I1006 13:16:26.185930 4867 generic.go:334] "Generic (PLEG): container finished" podID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" containerID="5d4b168f4790e4f21e5bde0b3c11e9ebcae056df901c42d05fcbb7d8b2dd8793" exitCode=0 Oct 06 13:16:26 crc kubenswrapper[4867]: I1006 13:16:26.186421 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhjkj" event={"ID":"1de8bdab-8be8-4faa-8423-da5d92ca8d51","Type":"ContainerDied","Data":"5d4b168f4790e4f21e5bde0b3c11e9ebcae056df901c42d05fcbb7d8b2dd8793"} Oct 06 13:16:27 crc kubenswrapper[4867]: I1006 13:16:27.194836 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhjkj" event={"ID":"1de8bdab-8be8-4faa-8423-da5d92ca8d51","Type":"ContainerStarted","Data":"edaa63c7f1abd59b65320759be73f882c50ad061f41d16194467263470b290b0"} Oct 06 13:16:27 crc kubenswrapper[4867]: I1006 13:16:27.231621 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xhjkj" podStartSLOduration=2.6415259669999998 podStartE2EDuration="8.231602567s" podCreationTimestamp="2025-10-06 13:16:19 +0000 UTC" firstStartedPulling="2025-10-06 13:16:21.136913787 +0000 UTC m=+760.594861931" lastFinishedPulling="2025-10-06 13:16:26.726990387 +0000 UTC m=+766.184938531" observedRunningTime="2025-10-06 13:16:27.226609491 +0000 UTC m=+766.684557635" watchObservedRunningTime="2025-10-06 13:16:27.231602567 +0000 UTC m=+766.689550711" Oct 06 13:16:30 crc kubenswrapper[4867]: I1006 13:16:30.119975 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:30 crc kubenswrapper[4867]: I1006 13:16:30.121022 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:30 crc kubenswrapper[4867]: I1006 13:16:30.164787 4867 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.338018 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gt755"] Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.339808 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.364213 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gt755"] Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.445723 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-catalog-content\") pod \"community-operators-gt755\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.445952 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-utilities\") pod \"community-operators-gt755\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.446119 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p99r\" (UniqueName: \"kubernetes.io/projected/a8052bdb-43d7-4f17-bede-bc70923ba08c-kube-api-access-2p99r\") pod \"community-operators-gt755\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.547300 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-catalog-content\") pod \"community-operators-gt755\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.547396 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-utilities\") pod \"community-operators-gt755\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.547457 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p99r\" (UniqueName: \"kubernetes.io/projected/a8052bdb-43d7-4f17-bede-bc70923ba08c-kube-api-access-2p99r\") pod \"community-operators-gt755\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.547997 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-catalog-content\") pod \"community-operators-gt755\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.548084 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-utilities\") pod \"community-operators-gt755\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.574704 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2p99r\" (UniqueName: \"kubernetes.io/projected/a8052bdb-43d7-4f17-bede-bc70923ba08c-kube-api-access-2p99r\") pod \"community-operators-gt755\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:36 crc kubenswrapper[4867]: I1006 13:16:36.661081 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:37 crc kubenswrapper[4867]: I1006 13:16:37.170031 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gt755"] Oct 06 13:16:37 crc kubenswrapper[4867]: I1006 13:16:37.264713 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gt755" event={"ID":"a8052bdb-43d7-4f17-bede-bc70923ba08c","Type":"ContainerStarted","Data":"2042602e9f97fcfbcf4dd5c1a78d755a15b7267241dca03c81517ad84fc29970"} Oct 06 13:16:38 crc kubenswrapper[4867]: I1006 13:16:38.272854 4867 generic.go:334] "Generic (PLEG): container finished" podID="a8052bdb-43d7-4f17-bede-bc70923ba08c" containerID="cad6926461adc0819879d90889555b6953c57ceb92de4531c45c2d5b4c09d09a" exitCode=0 Oct 06 13:16:38 crc kubenswrapper[4867]: I1006 13:16:38.272934 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gt755" event={"ID":"a8052bdb-43d7-4f17-bede-bc70923ba08c","Type":"ContainerDied","Data":"cad6926461adc0819879d90889555b6953c57ceb92de4531c45c2d5b4c09d09a"} Oct 06 13:16:38 crc kubenswrapper[4867]: I1006 13:16:38.842839 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-9d5469fbf-r6fln" Oct 06 13:16:39 crc kubenswrapper[4867]: I1006 13:16:39.283802 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gt755" 
event={"ID":"a8052bdb-43d7-4f17-bede-bc70923ba08c","Type":"ContainerStarted","Data":"0f4cd7229e5f88cee9dabdd30353bdb49adf3f01f67968bae9b6b344f8199abe"} Oct 06 13:16:40 crc kubenswrapper[4867]: I1006 13:16:40.181694 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:40 crc kubenswrapper[4867]: I1006 13:16:40.292909 4867 generic.go:334] "Generic (PLEG): container finished" podID="a8052bdb-43d7-4f17-bede-bc70923ba08c" containerID="0f4cd7229e5f88cee9dabdd30353bdb49adf3f01f67968bae9b6b344f8199abe" exitCode=0 Oct 06 13:16:40 crc kubenswrapper[4867]: I1006 13:16:40.292960 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gt755" event={"ID":"a8052bdb-43d7-4f17-bede-bc70923ba08c","Type":"ContainerDied","Data":"0f4cd7229e5f88cee9dabdd30353bdb49adf3f01f67968bae9b6b344f8199abe"} Oct 06 13:16:41 crc kubenswrapper[4867]: I1006 13:16:41.309360 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gt755" event={"ID":"a8052bdb-43d7-4f17-bede-bc70923ba08c","Type":"ContainerStarted","Data":"b3d803b9d3b253a91f3632cf9388638fa4774aa01c4bb6db94fa2606e906e1d1"} Oct 06 13:16:41 crc kubenswrapper[4867]: I1006 13:16:41.333603 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gt755" podStartSLOduration=2.820248528 podStartE2EDuration="5.333585422s" podCreationTimestamp="2025-10-06 13:16:36 +0000 UTC" firstStartedPulling="2025-10-06 13:16:38.275632509 +0000 UTC m=+777.733580643" lastFinishedPulling="2025-10-06 13:16:40.788969393 +0000 UTC m=+780.246917537" observedRunningTime="2025-10-06 13:16:41.326574081 +0000 UTC m=+780.784522215" watchObservedRunningTime="2025-10-06 13:16:41.333585422 +0000 UTC m=+780.791533566" Oct 06 13:16:41 crc kubenswrapper[4867]: I1006 13:16:41.476720 4867 scope.go:117] "RemoveContainer" 
containerID="f23823ef72bc471ab9f68c0ba8f8a908d55659711ccb2cca38713ef9c8d50c99" Oct 06 13:16:42 crc kubenswrapper[4867]: I1006 13:16:42.873229 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:16:42 crc kubenswrapper[4867]: I1006 13:16:42.873353 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:16:43 crc kubenswrapper[4867]: I1006 13:16:43.728346 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xhjkj"] Oct 06 13:16:43 crc kubenswrapper[4867]: I1006 13:16:43.728902 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xhjkj" podUID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" containerName="registry-server" containerID="cri-o://edaa63c7f1abd59b65320759be73f882c50ad061f41d16194467263470b290b0" gracePeriod=2 Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 13:16:44.334001 4867 generic.go:334] "Generic (PLEG): container finished" podID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" containerID="edaa63c7f1abd59b65320759be73f882c50ad061f41d16194467263470b290b0" exitCode=0 Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 13:16:44.334057 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhjkj" event={"ID":"1de8bdab-8be8-4faa-8423-da5d92ca8d51","Type":"ContainerDied","Data":"edaa63c7f1abd59b65320759be73f882c50ad061f41d16194467263470b290b0"} Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 
13:16:44.695679 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 13:16:44.777850 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwnln\" (UniqueName: \"kubernetes.io/projected/1de8bdab-8be8-4faa-8423-da5d92ca8d51-kube-api-access-jwnln\") pod \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 13:16:44.778776 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-catalog-content\") pod \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 13:16:44.778849 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-utilities\") pod \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\" (UID: \"1de8bdab-8be8-4faa-8423-da5d92ca8d51\") " Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 13:16:44.779631 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-utilities" (OuterVolumeSpecName: "utilities") pod "1de8bdab-8be8-4faa-8423-da5d92ca8d51" (UID: "1de8bdab-8be8-4faa-8423-da5d92ca8d51"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 13:16:44.779768 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 13:16:44.787289 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de8bdab-8be8-4faa-8423-da5d92ca8d51-kube-api-access-jwnln" (OuterVolumeSpecName: "kube-api-access-jwnln") pod "1de8bdab-8be8-4faa-8423-da5d92ca8d51" (UID: "1de8bdab-8be8-4faa-8423-da5d92ca8d51"). InnerVolumeSpecName "kube-api-access-jwnln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 13:16:44.832591 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1de8bdab-8be8-4faa-8423-da5d92ca8d51" (UID: "1de8bdab-8be8-4faa-8423-da5d92ca8d51"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 13:16:44.881474 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwnln\" (UniqueName: \"kubernetes.io/projected/1de8bdab-8be8-4faa-8423-da5d92ca8d51-kube-api-access-jwnln\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:44 crc kubenswrapper[4867]: I1006 13:16:44.881545 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1de8bdab-8be8-4faa-8423-da5d92ca8d51-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:45 crc kubenswrapper[4867]: I1006 13:16:45.345017 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xhjkj" event={"ID":"1de8bdab-8be8-4faa-8423-da5d92ca8d51","Type":"ContainerDied","Data":"83c53a4bc4f5167c9479b9a47dbe1b9b3304aff70e1e272193776fa161e56837"} Oct 06 13:16:45 crc kubenswrapper[4867]: I1006 13:16:45.345102 4867 scope.go:117] "RemoveContainer" containerID="edaa63c7f1abd59b65320759be73f882c50ad061f41d16194467263470b290b0" Oct 06 13:16:45 crc kubenswrapper[4867]: I1006 13:16:45.345108 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xhjkj" Oct 06 13:16:45 crc kubenswrapper[4867]: I1006 13:16:45.367023 4867 scope.go:117] "RemoveContainer" containerID="5d4b168f4790e4f21e5bde0b3c11e9ebcae056df901c42d05fcbb7d8b2dd8793" Oct 06 13:16:45 crc kubenswrapper[4867]: I1006 13:16:45.369365 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xhjkj"] Oct 06 13:16:45 crc kubenswrapper[4867]: I1006 13:16:45.374210 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xhjkj"] Oct 06 13:16:45 crc kubenswrapper[4867]: I1006 13:16:45.389059 4867 scope.go:117] "RemoveContainer" containerID="0f5573f970e5883de001b36603b88c68514b03c99912a13590004a048806e9f8" Oct 06 13:16:46 crc kubenswrapper[4867]: I1006 13:16:46.662064 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:46 crc kubenswrapper[4867]: I1006 13:16:46.662143 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:46 crc kubenswrapper[4867]: I1006 13:16:46.714961 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:47 crc kubenswrapper[4867]: I1006 13:16:47.229207 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" path="/var/lib/kubelet/pods/1de8bdab-8be8-4faa-8423-da5d92ca8d51/volumes" Oct 06 13:16:47 crc kubenswrapper[4867]: I1006 13:16:47.396956 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.125119 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gt755"] Oct 06 13:16:50 crc 
kubenswrapper[4867]: I1006 13:16:50.125647 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gt755" podUID="a8052bdb-43d7-4f17-bede-bc70923ba08c" containerName="registry-server" containerID="cri-o://b3d803b9d3b253a91f3632cf9388638fa4774aa01c4bb6db94fa2606e906e1d1" gracePeriod=2 Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.422924 4867 generic.go:334] "Generic (PLEG): container finished" podID="a8052bdb-43d7-4f17-bede-bc70923ba08c" containerID="b3d803b9d3b253a91f3632cf9388638fa4774aa01c4bb6db94fa2606e906e1d1" exitCode=0 Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.422988 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gt755" event={"ID":"a8052bdb-43d7-4f17-bede-bc70923ba08c","Type":"ContainerDied","Data":"b3d803b9d3b253a91f3632cf9388638fa4774aa01c4bb6db94fa2606e906e1d1"} Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.605719 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.681190 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p99r\" (UniqueName: \"kubernetes.io/projected/a8052bdb-43d7-4f17-bede-bc70923ba08c-kube-api-access-2p99r\") pod \"a8052bdb-43d7-4f17-bede-bc70923ba08c\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.681415 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-catalog-content\") pod \"a8052bdb-43d7-4f17-bede-bc70923ba08c\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.681462 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-utilities\") pod \"a8052bdb-43d7-4f17-bede-bc70923ba08c\" (UID: \"a8052bdb-43d7-4f17-bede-bc70923ba08c\") " Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.682588 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-utilities" (OuterVolumeSpecName: "utilities") pod "a8052bdb-43d7-4f17-bede-bc70923ba08c" (UID: "a8052bdb-43d7-4f17-bede-bc70923ba08c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.689342 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8052bdb-43d7-4f17-bede-bc70923ba08c-kube-api-access-2p99r" (OuterVolumeSpecName: "kube-api-access-2p99r") pod "a8052bdb-43d7-4f17-bede-bc70923ba08c" (UID: "a8052bdb-43d7-4f17-bede-bc70923ba08c"). InnerVolumeSpecName "kube-api-access-2p99r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.730602 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8052bdb-43d7-4f17-bede-bc70923ba08c" (UID: "a8052bdb-43d7-4f17-bede-bc70923ba08c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.783410 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.783459 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p99r\" (UniqueName: \"kubernetes.io/projected/a8052bdb-43d7-4f17-bede-bc70923ba08c-kube-api-access-2p99r\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:50 crc kubenswrapper[4867]: I1006 13:16:50.783472 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8052bdb-43d7-4f17-bede-bc70923ba08c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:51 crc kubenswrapper[4867]: I1006 13:16:51.432586 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gt755" event={"ID":"a8052bdb-43d7-4f17-bede-bc70923ba08c","Type":"ContainerDied","Data":"2042602e9f97fcfbcf4dd5c1a78d755a15b7267241dca03c81517ad84fc29970"} Oct 06 13:16:51 crc kubenswrapper[4867]: I1006 13:16:51.432650 4867 scope.go:117] "RemoveContainer" containerID="b3d803b9d3b253a91f3632cf9388638fa4774aa01c4bb6db94fa2606e906e1d1" Oct 06 13:16:51 crc kubenswrapper[4867]: I1006 13:16:51.432686 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gt755" Oct 06 13:16:51 crc kubenswrapper[4867]: I1006 13:16:51.451933 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gt755"] Oct 06 13:16:51 crc kubenswrapper[4867]: I1006 13:16:51.456031 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gt755"] Oct 06 13:16:51 crc kubenswrapper[4867]: I1006 13:16:51.456918 4867 scope.go:117] "RemoveContainer" containerID="0f4cd7229e5f88cee9dabdd30353bdb49adf3f01f67968bae9b6b344f8199abe" Oct 06 13:16:51 crc kubenswrapper[4867]: I1006 13:16:51.471897 4867 scope.go:117] "RemoveContainer" containerID="cad6926461adc0819879d90889555b6953c57ceb92de4531c45c2d5b4c09d09a" Oct 06 13:16:53 crc kubenswrapper[4867]: I1006 13:16:53.230844 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8052bdb-43d7-4f17-bede-bc70923ba08c" path="/var/lib/kubelet/pods/a8052bdb-43d7-4f17-bede-bc70923ba08c/volumes" Oct 06 13:16:58 crc kubenswrapper[4867]: I1006 13:16:58.355640 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5487d99769-x5czz" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.061417 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-8wgsb"] Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.061757 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" containerName="extract-utilities" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.061783 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" containerName="extract-utilities" Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.061801 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8052bdb-43d7-4f17-bede-bc70923ba08c" 
containerName="extract-utilities" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.061810 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8052bdb-43d7-4f17-bede-bc70923ba08c" containerName="extract-utilities" Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.061825 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" containerName="registry-server" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.061833 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" containerName="registry-server" Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.061847 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" containerName="extract-content" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.061854 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" containerName="extract-content" Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.061869 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8052bdb-43d7-4f17-bede-bc70923ba08c" containerName="registry-server" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.061876 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8052bdb-43d7-4f17-bede-bc70923ba08c" containerName="registry-server" Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.061887 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8052bdb-43d7-4f17-bede-bc70923ba08c" containerName="extract-content" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.061894 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8052bdb-43d7-4f17-bede-bc70923ba08c" containerName="extract-content" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.062050 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1de8bdab-8be8-4faa-8423-da5d92ca8d51" 
containerName="registry-server" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.062066 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8052bdb-43d7-4f17-bede-bc70923ba08c" containerName="registry-server" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.067605 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.069135 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52"] Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.070992 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-l5ppz" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.071221 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.071363 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.075549 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.078078 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.088334 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52"] Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.171980 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-xn6jt"] Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.173352 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.176398 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.178981 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.179873 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.180611 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-s8dsp" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.186574 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-qs5gj"] Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.187780 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:16:59 crc kubenswrapper[4867]: W1006 13:16:59.189500 4867 reflector.go:561] object-"metallb-system"/"controller-certs-secret": failed to list *v1.Secret: secrets "controller-certs-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.189549 4867 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"controller-certs-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"controller-certs-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.202208 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e88a96ad-49db-4ffa-b274-9160056bb4c9-metrics-certs\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.202468 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-frr-conf\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.202607 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq8gn\" (UniqueName: \"kubernetes.io/projected/e88a96ad-49db-4ffa-b274-9160056bb4c9-kube-api-access-fq8gn\") pod \"frr-k8s-8wgsb\" (UID: 
\"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.202734 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77pkl\" (UniqueName: \"kubernetes.io/projected/bab9da54-1204-49fd-af69-b48a1542d2e7-kube-api-access-77pkl\") pod \"frr-k8s-webhook-server-64bf5d555-xdc52\" (UID: \"bab9da54-1204-49fd-af69-b48a1542d2e7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.203486 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-frr-sockets\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.203556 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bab9da54-1204-49fd-af69-b48a1542d2e7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-xdc52\" (UID: \"bab9da54-1204-49fd-af69-b48a1542d2e7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.203598 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-metrics\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.203660 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-reloader\") pod \"frr-k8s-8wgsb\" (UID: 
\"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.203708 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e88a96ad-49db-4ffa-b274-9160056bb4c9-frr-startup\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.206809 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-qs5gj"] Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305180 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-frr-sockets\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305246 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bab9da54-1204-49fd-af69-b48a1542d2e7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-xdc52\" (UID: \"bab9da54-1204-49fd-af69-b48a1542d2e7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305290 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-metrics\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305332 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist\") pod 
\"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305362 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-reloader\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305398 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e88a96ad-49db-4ffa-b274-9160056bb4c9-frr-startup\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305430 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e88a96ad-49db-4ffa-b274-9160056bb4c9-metrics-certs\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305455 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-frr-conf\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305482 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-metallb-excludel2\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305510 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq8gn\" (UniqueName: \"kubernetes.io/projected/e88a96ad-49db-4ffa-b274-9160056bb4c9-kube-api-access-fq8gn\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305545 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vflgd\" (UniqueName: \"kubernetes.io/projected/39dc72e9-c1d5-4257-b8ea-248aaed554e5-kube-api-access-vflgd\") pod \"controller-68d546b9d8-qs5gj\" (UID: \"39dc72e9-c1d5-4257-b8ea-248aaed554e5\") " pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305568 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx8k5\" (UniqueName: \"kubernetes.io/projected/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-kube-api-access-lx8k5\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305597 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77pkl\" (UniqueName: \"kubernetes.io/projected/bab9da54-1204-49fd-af69-b48a1542d2e7-kube-api-access-77pkl\") pod \"frr-k8s-webhook-server-64bf5d555-xdc52\" (UID: \"bab9da54-1204-49fd-af69-b48a1542d2e7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305634 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-metrics-certs\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 
13:16:59.305660 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39dc72e9-c1d5-4257-b8ea-248aaed554e5-metrics-certs\") pod \"controller-68d546b9d8-qs5gj\" (UID: \"39dc72e9-c1d5-4257-b8ea-248aaed554e5\") " pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305684 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39dc72e9-c1d5-4257-b8ea-248aaed554e5-cert\") pod \"controller-68d546b9d8-qs5gj\" (UID: \"39dc72e9-c1d5-4257-b8ea-248aaed554e5\") " pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305694 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-frr-sockets\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.305935 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-metrics\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.306348 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-frr-conf\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.306498 4867 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 06 13:16:59 crc 
kubenswrapper[4867]: I1006 13:16:59.306549 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e88a96ad-49db-4ffa-b274-9160056bb4c9-reloader\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.306602 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e88a96ad-49db-4ffa-b274-9160056bb4c9-metrics-certs podName:e88a96ad-49db-4ffa-b274-9160056bb4c9 nodeName:}" failed. No retries permitted until 2025-10-06 13:16:59.80657547 +0000 UTC m=+799.264523624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e88a96ad-49db-4ffa-b274-9160056bb4c9-metrics-certs") pod "frr-k8s-8wgsb" (UID: "e88a96ad-49db-4ffa-b274-9160056bb4c9") : secret "frr-k8s-certs-secret" not found Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.306790 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e88a96ad-49db-4ffa-b274-9160056bb4c9-frr-startup\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.324978 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bab9da54-1204-49fd-af69-b48a1542d2e7-cert\") pod \"frr-k8s-webhook-server-64bf5d555-xdc52\" (UID: \"bab9da54-1204-49fd-af69-b48a1542d2e7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.330430 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77pkl\" (UniqueName: \"kubernetes.io/projected/bab9da54-1204-49fd-af69-b48a1542d2e7-kube-api-access-77pkl\") pod \"frr-k8s-webhook-server-64bf5d555-xdc52\" 
(UID: \"bab9da54-1204-49fd-af69-b48a1542d2e7\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.332668 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq8gn\" (UniqueName: \"kubernetes.io/projected/e88a96ad-49db-4ffa-b274-9160056bb4c9-kube-api-access-fq8gn\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.405764 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.408566 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-metallb-excludel2\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.408711 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vflgd\" (UniqueName: \"kubernetes.io/projected/39dc72e9-c1d5-4257-b8ea-248aaed554e5-kube-api-access-vflgd\") pod \"controller-68d546b9d8-qs5gj\" (UID: \"39dc72e9-c1d5-4257-b8ea-248aaed554e5\") " pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.408735 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx8k5\" (UniqueName: \"kubernetes.io/projected/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-kube-api-access-lx8k5\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.408832 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-metrics-certs\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.408879 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39dc72e9-c1d5-4257-b8ea-248aaed554e5-metrics-certs\") pod \"controller-68d546b9d8-qs5gj\" (UID: \"39dc72e9-c1d5-4257-b8ea-248aaed554e5\") " pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.408918 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39dc72e9-c1d5-4257-b8ea-248aaed554e5-cert\") pod \"controller-68d546b9d8-qs5gj\" (UID: \"39dc72e9-c1d5-4257-b8ea-248aaed554e5\") " pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.409126 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.409308 4867 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.409378 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-metrics-certs podName:e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb nodeName:}" failed. No retries permitted until 2025-10-06 13:16:59.909352585 +0000 UTC m=+799.367300719 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-metrics-certs") pod "speaker-xn6jt" (UID: "e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb") : secret "speaker-certs-secret" not found Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.409401 4867 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.409464 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist podName:e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb nodeName:}" failed. No retries permitted until 2025-10-06 13:16:59.909442787 +0000 UTC m=+799.367390931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist") pod "speaker-xn6jt" (UID: "e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb") : secret "metallb-memberlist" not found Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.409527 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-metallb-excludel2\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.411868 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.426262 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39dc72e9-c1d5-4257-b8ea-248aaed554e5-cert\") pod \"controller-68d546b9d8-qs5gj\" (UID: \"39dc72e9-c1d5-4257-b8ea-248aaed554e5\") " pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 
13:16:59.431227 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx8k5\" (UniqueName: \"kubernetes.io/projected/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-kube-api-access-lx8k5\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.433525 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vflgd\" (UniqueName: \"kubernetes.io/projected/39dc72e9-c1d5-4257-b8ea-248aaed554e5-kube-api-access-vflgd\") pod \"controller-68d546b9d8-qs5gj\" (UID: \"39dc72e9-c1d5-4257-b8ea-248aaed554e5\") " pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.815893 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e88a96ad-49db-4ffa-b274-9160056bb4c9-metrics-certs\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.819537 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e88a96ad-49db-4ffa-b274-9160056bb4c9-metrics-certs\") pod \"frr-k8s-8wgsb\" (UID: \"e88a96ad-49db-4ffa-b274-9160056bb4c9\") " pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.844968 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52"] Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.918271 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-metrics-certs\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc 
kubenswrapper[4867]: I1006 13:16:59.918388 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.918554 4867 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 13:16:59 crc kubenswrapper[4867]: E1006 13:16:59.918621 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist podName:e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb nodeName:}" failed. No retries permitted until 2025-10-06 13:17:00.918602603 +0000 UTC m=+800.376550747 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist") pod "speaker-xn6jt" (UID: "e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb") : secret "metallb-memberlist" not found Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.921335 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-metrics-certs\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:16:59 crc kubenswrapper[4867]: I1006 13:16:59.990595 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:17:00 crc kubenswrapper[4867]: E1006 13:17:00.409891 4867 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: failed to sync secret cache: timed out waiting for the condition Oct 06 13:17:00 crc kubenswrapper[4867]: E1006 13:17:00.409994 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39dc72e9-c1d5-4257-b8ea-248aaed554e5-metrics-certs podName:39dc72e9-c1d5-4257-b8ea-248aaed554e5 nodeName:}" failed. No retries permitted until 2025-10-06 13:17:00.909970433 +0000 UTC m=+800.367918577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39dc72e9-c1d5-4257-b8ea-248aaed554e5-metrics-certs") pod "controller-68d546b9d8-qs5gj" (UID: "39dc72e9-c1d5-4257-b8ea-248aaed554e5") : failed to sync secret cache: timed out waiting for the condition Oct 06 13:17:00 crc kubenswrapper[4867]: I1006 13:17:00.460095 4867 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 06 13:17:00 crc kubenswrapper[4867]: I1006 13:17:00.489640 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" event={"ID":"bab9da54-1204-49fd-af69-b48a1542d2e7","Type":"ContainerStarted","Data":"676af92677f2b68c36730c8d751ce86a23d451a2c882d766deb74370c04ea57f"} Oct 06 13:17:00 crc kubenswrapper[4867]: I1006 13:17:00.491155 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8wgsb" event={"ID":"e88a96ad-49db-4ffa-b274-9160056bb4c9","Type":"ContainerStarted","Data":"bc8f62483142871edc7ba72b0a1736af9d1beb5a51af7c4d4c95bf1579daed96"} Oct 06 13:17:00 crc kubenswrapper[4867]: I1006 13:17:00.933482 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39dc72e9-c1d5-4257-b8ea-248aaed554e5-metrics-certs\") 
pod \"controller-68d546b9d8-qs5gj\" (UID: \"39dc72e9-c1d5-4257-b8ea-248aaed554e5\") " pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:17:00 crc kubenswrapper[4867]: I1006 13:17:00.933878 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:17:00 crc kubenswrapper[4867]: E1006 13:17:00.934122 4867 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 13:17:00 crc kubenswrapper[4867]: E1006 13:17:00.934225 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist podName:e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb nodeName:}" failed. No retries permitted until 2025-10-06 13:17:02.93420077 +0000 UTC m=+802.392148914 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist") pod "speaker-xn6jt" (UID: "e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb") : secret "metallb-memberlist" not found Oct 06 13:17:00 crc kubenswrapper[4867]: I1006 13:17:00.942505 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39dc72e9-c1d5-4257-b8ea-248aaed554e5-metrics-certs\") pod \"controller-68d546b9d8-qs5gj\" (UID: \"39dc72e9-c1d5-4257-b8ea-248aaed554e5\") " pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:17:01 crc kubenswrapper[4867]: I1006 13:17:01.003784 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:17:01 crc kubenswrapper[4867]: I1006 13:17:01.410886 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-qs5gj"] Oct 06 13:17:01 crc kubenswrapper[4867]: W1006 13:17:01.414926 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39dc72e9_c1d5_4257_b8ea_248aaed554e5.slice/crio-9d7823a4676d2601f8df100726156792fdd535d84f05062afb9272a2626280bd WatchSource:0}: Error finding container 9d7823a4676d2601f8df100726156792fdd535d84f05062afb9272a2626280bd: Status 404 returned error can't find the container with id 9d7823a4676d2601f8df100726156792fdd535d84f05062afb9272a2626280bd Oct 06 13:17:01 crc kubenswrapper[4867]: I1006 13:17:01.499155 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-qs5gj" event={"ID":"39dc72e9-c1d5-4257-b8ea-248aaed554e5","Type":"ContainerStarted","Data":"9d7823a4676d2601f8df100726156792fdd535d84f05062afb9272a2626280bd"} Oct 06 13:17:02 crc kubenswrapper[4867]: I1006 13:17:02.527996 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-qs5gj" event={"ID":"39dc72e9-c1d5-4257-b8ea-248aaed554e5","Type":"ContainerStarted","Data":"f97cdcf5ed5695f94ea6418317a6b9f3dec5c4e0a4ba76f86613f1b82283c857"} Oct 06 13:17:02 crc kubenswrapper[4867]: I1006 13:17:02.528396 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:17:02 crc kubenswrapper[4867]: I1006 13:17:02.528411 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-qs5gj" event={"ID":"39dc72e9-c1d5-4257-b8ea-248aaed554e5","Type":"ContainerStarted","Data":"97e63fdc6da79732955240eceb0f29f3f39b9a87e81bd5cd6ebcc83d12371c95"} Oct 06 13:17:02 crc kubenswrapper[4867]: I1006 13:17:02.967762 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:17:02 crc kubenswrapper[4867]: I1006 13:17:02.976629 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb-memberlist\") pod \"speaker-xn6jt\" (UID: \"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb\") " pod="metallb-system/speaker-xn6jt" Oct 06 13:17:03 crc kubenswrapper[4867]: I1006 13:17:03.093169 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-xn6jt" Oct 06 13:17:03 crc kubenswrapper[4867]: W1006 13:17:03.126942 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode62a5b4f_ce8b_4681_a7fd_2453cd04e1cb.slice/crio-c2e5fd286cab67e5f5db09d84fe8a4869cddb971e71f8b4a949271c430939e4d WatchSource:0}: Error finding container c2e5fd286cab67e5f5db09d84fe8a4869cddb971e71f8b4a949271c430939e4d: Status 404 returned error can't find the container with id c2e5fd286cab67e5f5db09d84fe8a4869cddb971e71f8b4a949271c430939e4d Oct 06 13:17:03 crc kubenswrapper[4867]: I1006 13:17:03.539720 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xn6jt" event={"ID":"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb","Type":"ContainerStarted","Data":"e811a19a07fe2fd40d44ba0223eaf8cf4067725986c67d02e1a613ce6b756790"} Oct 06 13:17:03 crc kubenswrapper[4867]: I1006 13:17:03.539780 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xn6jt" event={"ID":"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb","Type":"ContainerStarted","Data":"c2e5fd286cab67e5f5db09d84fe8a4869cddb971e71f8b4a949271c430939e4d"} Oct 06 13:17:04 crc kubenswrapper[4867]: I1006 
13:17:04.556239 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xn6jt" event={"ID":"e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb","Type":"ContainerStarted","Data":"9f08028af7e6d4ba60ec4a36efd104f73e12cf4c3271fca9776f03747b331ddc"} Oct 06 13:17:04 crc kubenswrapper[4867]: I1006 13:17:04.556677 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-xn6jt" Oct 06 13:17:04 crc kubenswrapper[4867]: I1006 13:17:04.587954 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-xn6jt" podStartSLOduration=5.587933154 podStartE2EDuration="5.587933154s" podCreationTimestamp="2025-10-06 13:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:17:04.586927526 +0000 UTC m=+804.044875670" watchObservedRunningTime="2025-10-06 13:17:04.587933154 +0000 UTC m=+804.045881298" Oct 06 13:17:04 crc kubenswrapper[4867]: I1006 13:17:04.589516 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-qs5gj" podStartSLOduration=5.589512297 podStartE2EDuration="5.589512297s" podCreationTimestamp="2025-10-06 13:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:17:02.555887527 +0000 UTC m=+802.013835671" watchObservedRunningTime="2025-10-06 13:17:04.589512297 +0000 UTC m=+804.047460441" Oct 06 13:17:09 crc kubenswrapper[4867]: I1006 13:17:09.589046 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" event={"ID":"bab9da54-1204-49fd-af69-b48a1542d2e7","Type":"ContainerStarted","Data":"b3184ac0c98fb0debec7870ce0f5c248d82a8821ff40085a21ba8540448b0131"} Oct 06 13:17:09 crc kubenswrapper[4867]: I1006 13:17:09.589387 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" Oct 06 13:17:09 crc kubenswrapper[4867]: I1006 13:17:09.590516 4867 generic.go:334] "Generic (PLEG): container finished" podID="e88a96ad-49db-4ffa-b274-9160056bb4c9" containerID="d15f5ff297dc73318138beb967ba70e09bf3a52ae7981abba06ffa88b2a97d60" exitCode=0 Oct 06 13:17:09 crc kubenswrapper[4867]: I1006 13:17:09.590542 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8wgsb" event={"ID":"e88a96ad-49db-4ffa-b274-9160056bb4c9","Type":"ContainerDied","Data":"d15f5ff297dc73318138beb967ba70e09bf3a52ae7981abba06ffa88b2a97d60"} Oct 06 13:17:09 crc kubenswrapper[4867]: I1006 13:17:09.620870 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" podStartSLOduration=2.003542011 podStartE2EDuration="10.620848687s" podCreationTimestamp="2025-10-06 13:16:59 +0000 UTC" firstStartedPulling="2025-10-06 13:16:59.853700551 +0000 UTC m=+799.311648695" lastFinishedPulling="2025-10-06 13:17:08.471007237 +0000 UTC m=+807.928955371" observedRunningTime="2025-10-06 13:17:09.618929754 +0000 UTC m=+809.076877908" watchObservedRunningTime="2025-10-06 13:17:09.620848687 +0000 UTC m=+809.078796831" Oct 06 13:17:10 crc kubenswrapper[4867]: I1006 13:17:10.599960 4867 generic.go:334] "Generic (PLEG): container finished" podID="e88a96ad-49db-4ffa-b274-9160056bb4c9" containerID="357de6b687127b07889eb46bbd6c4270b4a17d6722507f07f6fece562b30a6e3" exitCode=0 Oct 06 13:17:10 crc kubenswrapper[4867]: I1006 13:17:10.600070 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8wgsb" event={"ID":"e88a96ad-49db-4ffa-b274-9160056bb4c9","Type":"ContainerDied","Data":"357de6b687127b07889eb46bbd6c4270b4a17d6722507f07f6fece562b30a6e3"} Oct 06 13:17:11 crc kubenswrapper[4867]: I1006 13:17:11.008867 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/controller-68d546b9d8-qs5gj" Oct 06 13:17:11 crc kubenswrapper[4867]: I1006 13:17:11.609120 4867 generic.go:334] "Generic (PLEG): container finished" podID="e88a96ad-49db-4ffa-b274-9160056bb4c9" containerID="5b89c73409abaae22fc4a5296d37d95694db2af2002397c2f34ec0c470304405" exitCode=0 Oct 06 13:17:11 crc kubenswrapper[4867]: I1006 13:17:11.609188 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8wgsb" event={"ID":"e88a96ad-49db-4ffa-b274-9160056bb4c9","Type":"ContainerDied","Data":"5b89c73409abaae22fc4a5296d37d95694db2af2002397c2f34ec0c470304405"} Oct 06 13:17:12 crc kubenswrapper[4867]: I1006 13:17:12.627372 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8wgsb" event={"ID":"e88a96ad-49db-4ffa-b274-9160056bb4c9","Type":"ContainerStarted","Data":"95bd9ba6444f7691baa74c715733f35e4ba660919c12c3cd0542c86592ac8f7a"} Oct 06 13:17:12 crc kubenswrapper[4867]: I1006 13:17:12.627741 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8wgsb" event={"ID":"e88a96ad-49db-4ffa-b274-9160056bb4c9","Type":"ContainerStarted","Data":"cae52ee27e90b88244fa9c81677d45bdfc2579e3789c863451323a914bdb3fe8"} Oct 06 13:17:12 crc kubenswrapper[4867]: I1006 13:17:12.627753 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8wgsb" event={"ID":"e88a96ad-49db-4ffa-b274-9160056bb4c9","Type":"ContainerStarted","Data":"14f573649d7f09221ae48f5066e6aa7bd86c938aeacb967d2c4813dffdc175cf"} Oct 06 13:17:12 crc kubenswrapper[4867]: I1006 13:17:12.627763 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8wgsb" event={"ID":"e88a96ad-49db-4ffa-b274-9160056bb4c9","Type":"ContainerStarted","Data":"6bc7be2b438ba9a6daf76719e43abb383e46e66380dece79d207478e1d9ff2d0"} Oct 06 13:17:12 crc kubenswrapper[4867]: I1006 13:17:12.627775 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8wgsb" 
event={"ID":"e88a96ad-49db-4ffa-b274-9160056bb4c9","Type":"ContainerStarted","Data":"166eac2f1e2207b1adb2ab20f70e3f5fc4f9a5f940f2c9a7dc6f8cb55a88b106"} Oct 06 13:17:12 crc kubenswrapper[4867]: I1006 13:17:12.873637 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:17:12 crc kubenswrapper[4867]: I1006 13:17:12.873737 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:17:13 crc kubenswrapper[4867]: I1006 13:17:13.097447 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-xn6jt" Oct 06 13:17:13 crc kubenswrapper[4867]: I1006 13:17:13.647019 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8wgsb" event={"ID":"e88a96ad-49db-4ffa-b274-9160056bb4c9","Type":"ContainerStarted","Data":"b97a56f06b3763be0eab569aa56bd56690910116cfd8f17b9e5d54f011ad6774"} Oct 06 13:17:13 crc kubenswrapper[4867]: I1006 13:17:13.647697 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:17:13 crc kubenswrapper[4867]: I1006 13:17:13.679891 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-8wgsb" podStartSLOduration=6.325575813 podStartE2EDuration="14.679866761s" podCreationTimestamp="2025-10-06 13:16:59 +0000 UTC" firstStartedPulling="2025-10-06 13:17:00.108025232 +0000 UTC m=+799.565973376" lastFinishedPulling="2025-10-06 13:17:08.46231619 +0000 UTC m=+807.920264324" 
observedRunningTime="2025-10-06 13:17:13.677473706 +0000 UTC m=+813.135421870" watchObservedRunningTime="2025-10-06 13:17:13.679866761 +0000 UTC m=+813.137814905" Oct 06 13:17:14 crc kubenswrapper[4867]: I1006 13:17:14.991159 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:17:15 crc kubenswrapper[4867]: I1006 13:17:15.026687 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:17:15 crc kubenswrapper[4867]: I1006 13:17:15.840610 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-svnrt"] Oct 06 13:17:15 crc kubenswrapper[4867]: I1006 13:17:15.841538 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-svnrt" Oct 06 13:17:15 crc kubenswrapper[4867]: I1006 13:17:15.846343 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fz5jw" Oct 06 13:17:15 crc kubenswrapper[4867]: I1006 13:17:15.846351 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 06 13:17:15 crc kubenswrapper[4867]: I1006 13:17:15.850274 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 06 13:17:15 crc kubenswrapper[4867]: I1006 13:17:15.856477 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-svnrt"] Oct 06 13:17:15 crc kubenswrapper[4867]: I1006 13:17:15.876071 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dzkm\" (UniqueName: \"kubernetes.io/projected/5054eef5-e295-448e-b892-de2d4c63db76-kube-api-access-7dzkm\") pod \"openstack-operator-index-svnrt\" (UID: \"5054eef5-e295-448e-b892-de2d4c63db76\") " 
pod="openstack-operators/openstack-operator-index-svnrt" Oct 06 13:17:15 crc kubenswrapper[4867]: I1006 13:17:15.977669 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dzkm\" (UniqueName: \"kubernetes.io/projected/5054eef5-e295-448e-b892-de2d4c63db76-kube-api-access-7dzkm\") pod \"openstack-operator-index-svnrt\" (UID: \"5054eef5-e295-448e-b892-de2d4c63db76\") " pod="openstack-operators/openstack-operator-index-svnrt" Oct 06 13:17:16 crc kubenswrapper[4867]: I1006 13:17:16.004781 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dzkm\" (UniqueName: \"kubernetes.io/projected/5054eef5-e295-448e-b892-de2d4c63db76-kube-api-access-7dzkm\") pod \"openstack-operator-index-svnrt\" (UID: \"5054eef5-e295-448e-b892-de2d4c63db76\") " pod="openstack-operators/openstack-operator-index-svnrt" Oct 06 13:17:16 crc kubenswrapper[4867]: I1006 13:17:16.175940 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-svnrt" Oct 06 13:17:16 crc kubenswrapper[4867]: I1006 13:17:16.608196 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-svnrt"] Oct 06 13:17:16 crc kubenswrapper[4867]: I1006 13:17:16.676804 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-svnrt" event={"ID":"5054eef5-e295-448e-b892-de2d4c63db76","Type":"ContainerStarted","Data":"e6f568c04b2ecc2d213b6c26c73aee2cabde459d8eaf94c816243dfda86ab81e"} Oct 06 13:17:19 crc kubenswrapper[4867]: I1006 13:17:19.413087 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xdc52" Oct 06 13:17:19 crc kubenswrapper[4867]: I1006 13:17:19.700911 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-svnrt" 
event={"ID":"5054eef5-e295-448e-b892-de2d4c63db76","Type":"ContainerStarted","Data":"efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1"} Oct 06 13:17:20 crc kubenswrapper[4867]: I1006 13:17:20.009894 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-svnrt" podStartSLOduration=2.706329877 podStartE2EDuration="5.009867583s" podCreationTimestamp="2025-10-06 13:17:15 +0000 UTC" firstStartedPulling="2025-10-06 13:17:16.61740974 +0000 UTC m=+816.075357884" lastFinishedPulling="2025-10-06 13:17:18.920947426 +0000 UTC m=+818.378895590" observedRunningTime="2025-10-06 13:17:19.716729994 +0000 UTC m=+819.174678138" watchObservedRunningTime="2025-10-06 13:17:20.009867583 +0000 UTC m=+819.467815737" Oct 06 13:17:20 crc kubenswrapper[4867]: I1006 13:17:20.015545 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-svnrt"] Oct 06 13:17:20 crc kubenswrapper[4867]: I1006 13:17:20.825548 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7ptcs"] Oct 06 13:17:20 crc kubenswrapper[4867]: I1006 13:17:20.826859 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7ptcs" Oct 06 13:17:20 crc kubenswrapper[4867]: I1006 13:17:20.841651 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7ptcs"] Oct 06 13:17:20 crc kubenswrapper[4867]: I1006 13:17:20.963236 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxj9w\" (UniqueName: \"kubernetes.io/projected/160e7b7c-4f2e-4dba-99a5-35c4d3d9868d-kube-api-access-qxj9w\") pod \"openstack-operator-index-7ptcs\" (UID: \"160e7b7c-4f2e-4dba-99a5-35c4d3d9868d\") " pod="openstack-operators/openstack-operator-index-7ptcs" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.064996 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxj9w\" (UniqueName: \"kubernetes.io/projected/160e7b7c-4f2e-4dba-99a5-35c4d3d9868d-kube-api-access-qxj9w\") pod \"openstack-operator-index-7ptcs\" (UID: \"160e7b7c-4f2e-4dba-99a5-35c4d3d9868d\") " pod="openstack-operators/openstack-operator-index-7ptcs" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.098103 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxj9w\" (UniqueName: \"kubernetes.io/projected/160e7b7c-4f2e-4dba-99a5-35c4d3d9868d-kube-api-access-qxj9w\") pod \"openstack-operator-index-7ptcs\" (UID: \"160e7b7c-4f2e-4dba-99a5-35c4d3d9868d\") " pod="openstack-operators/openstack-operator-index-7ptcs" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.147705 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7ptcs" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.232824 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d459h"] Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.234725 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.244953 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d459h"] Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.387600 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-utilities\") pod \"redhat-marketplace-d459h\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") " pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.387697 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-catalog-content\") pod \"redhat-marketplace-d459h\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") " pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.387774 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gkpp\" (UniqueName: \"kubernetes.io/projected/a45871bb-a8b0-4f5e-aa7b-1416a806140f-kube-api-access-4gkpp\") pod \"redhat-marketplace-d459h\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") " pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.489350 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gkpp\" (UniqueName: \"kubernetes.io/projected/a45871bb-a8b0-4f5e-aa7b-1416a806140f-kube-api-access-4gkpp\") pod \"redhat-marketplace-d459h\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") " pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.489474 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-utilities\") pod \"redhat-marketplace-d459h\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") " pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.489535 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-catalog-content\") pod \"redhat-marketplace-d459h\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") " pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.490287 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-catalog-content\") pod \"redhat-marketplace-d459h\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") " pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.490359 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-utilities\") pod \"redhat-marketplace-d459h\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") " pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.518097 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gkpp\" (UniqueName: \"kubernetes.io/projected/a45871bb-a8b0-4f5e-aa7b-1416a806140f-kube-api-access-4gkpp\") pod \"redhat-marketplace-d459h\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") " pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.563486 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.618000 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7ptcs"] Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.725089 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7ptcs" event={"ID":"160e7b7c-4f2e-4dba-99a5-35c4d3d9868d","Type":"ContainerStarted","Data":"df61eb09069c433151d7139272f9fc37b04e47a5359540c1aaeb443614cadf72"} Oct 06 13:17:21 crc kubenswrapper[4867]: I1006 13:17:21.725175 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-svnrt" podUID="5054eef5-e295-448e-b892-de2d4c63db76" containerName="registry-server" containerID="cri-o://efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1" gracePeriod=2 Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.005578 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d459h"] Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.071141 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-svnrt" Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.200234 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dzkm\" (UniqueName: \"kubernetes.io/projected/5054eef5-e295-448e-b892-de2d4c63db76-kube-api-access-7dzkm\") pod \"5054eef5-e295-448e-b892-de2d4c63db76\" (UID: \"5054eef5-e295-448e-b892-de2d4c63db76\") " Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.206553 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5054eef5-e295-448e-b892-de2d4c63db76-kube-api-access-7dzkm" (OuterVolumeSpecName: "kube-api-access-7dzkm") pod "5054eef5-e295-448e-b892-de2d4c63db76" (UID: "5054eef5-e295-448e-b892-de2d4c63db76"). InnerVolumeSpecName "kube-api-access-7dzkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.302222 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dzkm\" (UniqueName: \"kubernetes.io/projected/5054eef5-e295-448e-b892-de2d4c63db76-kube-api-access-7dzkm\") on node \"crc\" DevicePath \"\"" Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.735340 4867 generic.go:334] "Generic (PLEG): container finished" podID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" containerID="fc37fc869f99cc4b88baa947884dd775aca2fe7a6e25d8e41a04da25f7b825c5" exitCode=0 Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.735389 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d459h" event={"ID":"a45871bb-a8b0-4f5e-aa7b-1416a806140f","Type":"ContainerDied","Data":"fc37fc869f99cc4b88baa947884dd775aca2fe7a6e25d8e41a04da25f7b825c5"} Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.735718 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d459h" 
event={"ID":"a45871bb-a8b0-4f5e-aa7b-1416a806140f","Type":"ContainerStarted","Data":"8e465d42cd05924f3a5fc9926a23dfd09966b9057b50b0f1354602e4a179a07b"} Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.739034 4867 generic.go:334] "Generic (PLEG): container finished" podID="5054eef5-e295-448e-b892-de2d4c63db76" containerID="efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1" exitCode=0 Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.739114 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-svnrt" Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.741481 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-svnrt" event={"ID":"5054eef5-e295-448e-b892-de2d4c63db76","Type":"ContainerDied","Data":"efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1"} Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.741655 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-svnrt" event={"ID":"5054eef5-e295-448e-b892-de2d4c63db76","Type":"ContainerDied","Data":"e6f568c04b2ecc2d213b6c26c73aee2cabde459d8eaf94c816243dfda86ab81e"} Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.741696 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7ptcs" event={"ID":"160e7b7c-4f2e-4dba-99a5-35c4d3d9868d","Type":"ContainerStarted","Data":"3f79788d98361103b788d7722013412f62e23cece9ac28f8a2c676b3dba44c46"} Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.741740 4867 scope.go:117] "RemoveContainer" containerID="efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1" Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.759850 4867 scope.go:117] "RemoveContainer" containerID="efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1" Oct 06 13:17:22 crc kubenswrapper[4867]: E1006 
13:17:22.760559 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1\": container with ID starting with efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1 not found: ID does not exist" containerID="efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1" Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.760600 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1"} err="failed to get container status \"efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1\": rpc error: code = NotFound desc = could not find container \"efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1\": container with ID starting with efa24fdae815504f368bee783fd6a8c449e09632002414bd6284b749d5da8dc1 not found: ID does not exist" Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.778228 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7ptcs" podStartSLOduration=2.708601905 podStartE2EDuration="2.778204034s" podCreationTimestamp="2025-10-06 13:17:20 +0000 UTC" firstStartedPulling="2025-10-06 13:17:21.642071438 +0000 UTC m=+821.100019582" lastFinishedPulling="2025-10-06 13:17:21.711673557 +0000 UTC m=+821.169621711" observedRunningTime="2025-10-06 13:17:22.77509998 +0000 UTC m=+822.233048144" watchObservedRunningTime="2025-10-06 13:17:22.778204034 +0000 UTC m=+822.236152178" Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.796362 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-svnrt"] Oct 06 13:17:22 crc kubenswrapper[4867]: I1006 13:17:22.801435 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-svnrt"] Oct 
06 13:17:23 crc kubenswrapper[4867]: I1006 13:17:23.233223 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5054eef5-e295-448e-b892-de2d4c63db76" path="/var/lib/kubelet/pods/5054eef5-e295-448e-b892-de2d4c63db76/volumes" Oct 06 13:17:23 crc kubenswrapper[4867]: I1006 13:17:23.755871 4867 generic.go:334] "Generic (PLEG): container finished" podID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" containerID="7b49b772588fb7adf3e614320cc20138e69f066410824a1c5520b1f6424ef7d8" exitCode=0 Oct 06 13:17:23 crc kubenswrapper[4867]: I1006 13:17:23.755940 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d459h" event={"ID":"a45871bb-a8b0-4f5e-aa7b-1416a806140f","Type":"ContainerDied","Data":"7b49b772588fb7adf3e614320cc20138e69f066410824a1c5520b1f6424ef7d8"} Oct 06 13:17:24 crc kubenswrapper[4867]: I1006 13:17:24.769484 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d459h" event={"ID":"a45871bb-a8b0-4f5e-aa7b-1416a806140f","Type":"ContainerStarted","Data":"71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce"} Oct 06 13:17:24 crc kubenswrapper[4867]: I1006 13:17:24.792849 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d459h" podStartSLOduration=2.320067563 podStartE2EDuration="3.792824915s" podCreationTimestamp="2025-10-06 13:17:21 +0000 UTC" firstStartedPulling="2025-10-06 13:17:22.737834333 +0000 UTC m=+822.195782517" lastFinishedPulling="2025-10-06 13:17:24.210591725 +0000 UTC m=+823.668539869" observedRunningTime="2025-10-06 13:17:24.788776015 +0000 UTC m=+824.246724189" watchObservedRunningTime="2025-10-06 13:17:24.792824915 +0000 UTC m=+824.250773079" Oct 06 13:17:29 crc kubenswrapper[4867]: I1006 13:17:29.994700 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-8wgsb" Oct 06 13:17:31 crc kubenswrapper[4867]: I1006 
13:17:31.148812 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7ptcs" Oct 06 13:17:31 crc kubenswrapper[4867]: I1006 13:17:31.149302 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7ptcs" Oct 06 13:17:31 crc kubenswrapper[4867]: I1006 13:17:31.186982 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-7ptcs" Oct 06 13:17:31 crc kubenswrapper[4867]: I1006 13:17:31.564365 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:31 crc kubenswrapper[4867]: I1006 13:17:31.564460 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:31 crc kubenswrapper[4867]: I1006 13:17:31.626977 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:31 crc kubenswrapper[4867]: I1006 13:17:31.863177 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7ptcs" Oct 06 13:17:31 crc kubenswrapper[4867]: I1006 13:17:31.890046 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d459h" Oct 06 13:17:32 crc kubenswrapper[4867]: I1006 13:17:32.815287 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d459h"] Oct 06 13:17:33 crc kubenswrapper[4867]: I1006 13:17:33.839676 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d459h" podUID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" containerName="registry-server" 
containerID="cri-o://71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce" gracePeriod=2
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.298628 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d459h"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.407735 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gkpp\" (UniqueName: \"kubernetes.io/projected/a45871bb-a8b0-4f5e-aa7b-1416a806140f-kube-api-access-4gkpp\") pod \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") "
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.407796 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-catalog-content\") pod \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") "
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.407849 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-utilities\") pod \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\" (UID: \"a45871bb-a8b0-4f5e-aa7b-1416a806140f\") "
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.409155 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-utilities" (OuterVolumeSpecName: "utilities") pod "a45871bb-a8b0-4f5e-aa7b-1416a806140f" (UID: "a45871bb-a8b0-4f5e-aa7b-1416a806140f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.425859 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fw5nc"]
Oct 06 13:17:34 crc kubenswrapper[4867]: E1006 13:17:34.426235 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" containerName="extract-utilities"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.426290 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" containerName="extract-utilities"
Oct 06 13:17:34 crc kubenswrapper[4867]: E1006 13:17:34.426311 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" containerName="extract-content"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.426320 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" containerName="extract-content"
Oct 06 13:17:34 crc kubenswrapper[4867]: E1006 13:17:34.426346 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5054eef5-e295-448e-b892-de2d4c63db76" containerName="registry-server"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.426357 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5054eef5-e295-448e-b892-de2d4c63db76" containerName="registry-server"
Oct 06 13:17:34 crc kubenswrapper[4867]: E1006 13:17:34.426368 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" containerName="registry-server"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.426378 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" containerName="registry-server"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.426531 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" containerName="registry-server"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.426560 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5054eef5-e295-448e-b892-de2d4c63db76" containerName="registry-server"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.429831 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45871bb-a8b0-4f5e-aa7b-1416a806140f-kube-api-access-4gkpp" (OuterVolumeSpecName: "kube-api-access-4gkpp") pod "a45871bb-a8b0-4f5e-aa7b-1416a806140f" (UID: "a45871bb-a8b0-4f5e-aa7b-1416a806140f"). InnerVolumeSpecName "kube-api-access-4gkpp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.432046 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.440941 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a45871bb-a8b0-4f5e-aa7b-1416a806140f" (UID: "a45871bb-a8b0-4f5e-aa7b-1416a806140f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.443971 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fw5nc"]
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.509683 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-utilities\") pod \"redhat-operators-fw5nc\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") " pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.509760 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tklj\" (UniqueName: \"kubernetes.io/projected/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-kube-api-access-9tklj\") pod \"redhat-operators-fw5nc\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") " pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.509878 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-catalog-content\") pod \"redhat-operators-fw5nc\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") " pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.509951 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gkpp\" (UniqueName: \"kubernetes.io/projected/a45871bb-a8b0-4f5e-aa7b-1416a806140f-kube-api-access-4gkpp\") on node \"crc\" DevicePath \"\""
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.509962 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.509972 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a45871bb-a8b0-4f5e-aa7b-1416a806140f-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.611880 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-catalog-content\") pod \"redhat-operators-fw5nc\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") " pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.611960 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-utilities\") pod \"redhat-operators-fw5nc\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") " pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.612025 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tklj\" (UniqueName: \"kubernetes.io/projected/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-kube-api-access-9tklj\") pod \"redhat-operators-fw5nc\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") " pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.612540 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-catalog-content\") pod \"redhat-operators-fw5nc\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") " pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.612560 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-utilities\") pod \"redhat-operators-fw5nc\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") " pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.631132 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tklj\" (UniqueName: \"kubernetes.io/projected/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-kube-api-access-9tklj\") pod \"redhat-operators-fw5nc\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") " pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.775963 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.848909 4867 generic.go:334] "Generic (PLEG): container finished" podID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" containerID="71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce" exitCode=0
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.848973 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d459h" event={"ID":"a45871bb-a8b0-4f5e-aa7b-1416a806140f","Type":"ContainerDied","Data":"71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce"}
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.848981 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d459h"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.849043 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d459h" event={"ID":"a45871bb-a8b0-4f5e-aa7b-1416a806140f","Type":"ContainerDied","Data":"8e465d42cd05924f3a5fc9926a23dfd09966b9057b50b0f1354602e4a179a07b"}
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.849076 4867 scope.go:117] "RemoveContainer" containerID="71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.871224 4867 scope.go:117] "RemoveContainer" containerID="7b49b772588fb7adf3e614320cc20138e69f066410824a1c5520b1f6424ef7d8"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.886697 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d459h"]
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.891119 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d459h"]
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.914439 4867 scope.go:117] "RemoveContainer" containerID="fc37fc869f99cc4b88baa947884dd775aca2fe7a6e25d8e41a04da25f7b825c5"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.936550 4867 scope.go:117] "RemoveContainer" containerID="71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce"
Oct 06 13:17:34 crc kubenswrapper[4867]: E1006 13:17:34.937091 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce\": container with ID starting with 71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce not found: ID does not exist" containerID="71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.937162 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce"} err="failed to get container status \"71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce\": rpc error: code = NotFound desc = could not find container \"71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce\": container with ID starting with 71b94b08565fd03ada5fe328c5a2dca2718809fbd4556d85061cae4fbf46b6ce not found: ID does not exist"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.937203 4867 scope.go:117] "RemoveContainer" containerID="7b49b772588fb7adf3e614320cc20138e69f066410824a1c5520b1f6424ef7d8"
Oct 06 13:17:34 crc kubenswrapper[4867]: E1006 13:17:34.937990 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b49b772588fb7adf3e614320cc20138e69f066410824a1c5520b1f6424ef7d8\": container with ID starting with 7b49b772588fb7adf3e614320cc20138e69f066410824a1c5520b1f6424ef7d8 not found: ID does not exist" containerID="7b49b772588fb7adf3e614320cc20138e69f066410824a1c5520b1f6424ef7d8"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.938041 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b49b772588fb7adf3e614320cc20138e69f066410824a1c5520b1f6424ef7d8"} err="failed to get container status \"7b49b772588fb7adf3e614320cc20138e69f066410824a1c5520b1f6424ef7d8\": rpc error: code = NotFound desc = could not find container \"7b49b772588fb7adf3e614320cc20138e69f066410824a1c5520b1f6424ef7d8\": container with ID starting with 7b49b772588fb7adf3e614320cc20138e69f066410824a1c5520b1f6424ef7d8 not found: ID does not exist"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.938073 4867 scope.go:117] "RemoveContainer" containerID="fc37fc869f99cc4b88baa947884dd775aca2fe7a6e25d8e41a04da25f7b825c5"
Oct 06 13:17:34 crc kubenswrapper[4867]: E1006 13:17:34.938596 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc37fc869f99cc4b88baa947884dd775aca2fe7a6e25d8e41a04da25f7b825c5\": container with ID starting with fc37fc869f99cc4b88baa947884dd775aca2fe7a6e25d8e41a04da25f7b825c5 not found: ID does not exist" containerID="fc37fc869f99cc4b88baa947884dd775aca2fe7a6e25d8e41a04da25f7b825c5"
Oct 06 13:17:34 crc kubenswrapper[4867]: I1006 13:17:34.938623 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc37fc869f99cc4b88baa947884dd775aca2fe7a6e25d8e41a04da25f7b825c5"} err="failed to get container status \"fc37fc869f99cc4b88baa947884dd775aca2fe7a6e25d8e41a04da25f7b825c5\": rpc error: code = NotFound desc = could not find container \"fc37fc869f99cc4b88baa947884dd775aca2fe7a6e25d8e41a04da25f7b825c5\": container with ID starting with fc37fc869f99cc4b88baa947884dd775aca2fe7a6e25d8e41a04da25f7b825c5 not found: ID does not exist"
Oct 06 13:17:35 crc kubenswrapper[4867]: I1006 13:17:35.230176 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45871bb-a8b0-4f5e-aa7b-1416a806140f" path="/var/lib/kubelet/pods/a45871bb-a8b0-4f5e-aa7b-1416a806140f/volumes"
Oct 06 13:17:35 crc kubenswrapper[4867]: I1006 13:17:35.242811 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fw5nc"]
Oct 06 13:17:35 crc kubenswrapper[4867]: I1006 13:17:35.856616 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw5nc" event={"ID":"a8c5bde4-82ee-4ac7-9e51-c77f809776e6","Type":"ContainerStarted","Data":"030a0b654de36fe71c0eaad9f39c76d8175076f95b4ee71e4b3ebf9e6366a0b2"}
Oct 06 13:17:36 crc kubenswrapper[4867]: I1006 13:17:36.870651 4867 generic.go:334] "Generic (PLEG): container finished" podID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" containerID="f02484e89f9ca2cd5e50db140a566540dfb34c28e2be3ac339bbc931bc7b91cd" exitCode=0
Oct 06 13:17:36 crc kubenswrapper[4867]: I1006 13:17:36.870778 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw5nc" event={"ID":"a8c5bde4-82ee-4ac7-9e51-c77f809776e6","Type":"ContainerDied","Data":"f02484e89f9ca2cd5e50db140a566540dfb34c28e2be3ac339bbc931bc7b91cd"}
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.677300 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"]
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.679562 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.682161 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-psc47"
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.694276 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"]
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.777533 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-bundle\") pod \"2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") " pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.777614 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljfwx\" (UniqueName: \"kubernetes.io/projected/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-kube-api-access-ljfwx\") pod \"2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") " pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.777681 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-util\") pod \"2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") " pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.879850 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-bundle\") pod \"2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") " pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.880009 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljfwx\" (UniqueName: \"kubernetes.io/projected/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-kube-api-access-ljfwx\") pod \"2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") " pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.880081 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-util\") pod \"2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") " pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.880551 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-bundle\") pod \"2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") " pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.880578 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-util\") pod \"2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") " pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.900866 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljfwx\" (UniqueName: \"kubernetes.io/projected/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-kube-api-access-ljfwx\") pod \"2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") " pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.907158 4867 generic.go:334] "Generic (PLEG): container finished" podID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" containerID="85aabb0edf77e4575dce0a838aad71b3a61a3bb1ffa9695d92db6f2a34c5c40f" exitCode=0
Oct 06 13:17:38 crc kubenswrapper[4867]: I1006 13:17:38.907215 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw5nc" event={"ID":"a8c5bde4-82ee-4ac7-9e51-c77f809776e6","Type":"ContainerDied","Data":"85aabb0edf77e4575dce0a838aad71b3a61a3bb1ffa9695d92db6f2a34c5c40f"}
Oct 06 13:17:39 crc kubenswrapper[4867]: I1006 13:17:39.001768 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:39 crc kubenswrapper[4867]: I1006 13:17:39.443287 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"]
Oct 06 13:17:39 crc kubenswrapper[4867]: I1006 13:17:39.919799 4867 generic.go:334] "Generic (PLEG): container finished" podID="b09c33a7-5fbd-4594-bebf-bcb9a5519e89" containerID="92ea5c0513d3a05a84f86a7e1cb60d60bd9f21528aa3a9e955c6a7a7d409bb84" exitCode=0
Oct 06 13:17:39 crc kubenswrapper[4867]: I1006 13:17:39.919861 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b" event={"ID":"b09c33a7-5fbd-4594-bebf-bcb9a5519e89","Type":"ContainerDied","Data":"92ea5c0513d3a05a84f86a7e1cb60d60bd9f21528aa3a9e955c6a7a7d409bb84"}
Oct 06 13:17:39 crc kubenswrapper[4867]: I1006 13:17:39.919902 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b" event={"ID":"b09c33a7-5fbd-4594-bebf-bcb9a5519e89","Type":"ContainerStarted","Data":"e87d5117dccf80886a662890e562edcb4eb4cd1d30194e6a9cb4fea02dc3c675"}
Oct 06 13:17:40 crc kubenswrapper[4867]: I1006 13:17:40.930081 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw5nc" event={"ID":"a8c5bde4-82ee-4ac7-9e51-c77f809776e6","Type":"ContainerStarted","Data":"4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2"}
Oct 06 13:17:40 crc kubenswrapper[4867]: I1006 13:17:40.931740 4867 generic.go:334] "Generic (PLEG): container finished" podID="b09c33a7-5fbd-4594-bebf-bcb9a5519e89" containerID="dc010ce86a249a49658b8c6077956c43180d2f27ebce7a5dd14b7c142164dcee" exitCode=0
Oct 06 13:17:40 crc kubenswrapper[4867]: I1006 13:17:40.931809 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b" event={"ID":"b09c33a7-5fbd-4594-bebf-bcb9a5519e89","Type":"ContainerDied","Data":"dc010ce86a249a49658b8c6077956c43180d2f27ebce7a5dd14b7c142164dcee"}
Oct 06 13:17:40 crc kubenswrapper[4867]: I1006 13:17:40.948623 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fw5nc" podStartSLOduration=3.978032094 podStartE2EDuration="6.948600493s" podCreationTimestamp="2025-10-06 13:17:34 +0000 UTC" firstStartedPulling="2025-10-06 13:17:36.873333935 +0000 UTC m=+836.331282119" lastFinishedPulling="2025-10-06 13:17:39.843902374 +0000 UTC m=+839.301850518" observedRunningTime="2025-10-06 13:17:40.948382077 +0000 UTC m=+840.406330221" watchObservedRunningTime="2025-10-06 13:17:40.948600493 +0000 UTC m=+840.406548637"
Oct 06 13:17:41 crc kubenswrapper[4867]: I1006 13:17:41.940861 4867 generic.go:334] "Generic (PLEG): container finished" podID="b09c33a7-5fbd-4594-bebf-bcb9a5519e89" containerID="e3fe0c1a8764130fa3c348ab5430c16de37dfe3730405adc70881a5afe410a59" exitCode=0
Oct 06 13:17:41 crc kubenswrapper[4867]: I1006 13:17:41.940963 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b" event={"ID":"b09c33a7-5fbd-4594-bebf-bcb9a5519e89","Type":"ContainerDied","Data":"e3fe0c1a8764130fa3c348ab5430c16de37dfe3730405adc70881a5afe410a59"}
Oct 06 13:17:42 crc kubenswrapper[4867]: I1006 13:17:42.874038 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 13:17:42 crc kubenswrapper[4867]: I1006 13:17:42.874844 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 13:17:42 crc kubenswrapper[4867]: I1006 13:17:42.874947 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq"
Oct 06 13:17:42 crc kubenswrapper[4867]: I1006 13:17:42.876319 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21254038d2e08625414f4e3fd77d4aa603650bf9aa5cea1080c49abec73a2651"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 13:17:42 crc kubenswrapper[4867]: I1006 13:17:42.876465 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://21254038d2e08625414f4e3fd77d4aa603650bf9aa5cea1080c49abec73a2651" gracePeriod=600
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.256778 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.348916 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljfwx\" (UniqueName: \"kubernetes.io/projected/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-kube-api-access-ljfwx\") pod \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") "
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.349065 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-bundle\") pod \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") "
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.349141 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-util\") pod \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\" (UID: \"b09c33a7-5fbd-4594-bebf-bcb9a5519e89\") "
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.350761 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-bundle" (OuterVolumeSpecName: "bundle") pod "b09c33a7-5fbd-4594-bebf-bcb9a5519e89" (UID: "b09c33a7-5fbd-4594-bebf-bcb9a5519e89"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.359117 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-kube-api-access-ljfwx" (OuterVolumeSpecName: "kube-api-access-ljfwx") pod "b09c33a7-5fbd-4594-bebf-bcb9a5519e89" (UID: "b09c33a7-5fbd-4594-bebf-bcb9a5519e89"). InnerVolumeSpecName "kube-api-access-ljfwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.365002 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-util" (OuterVolumeSpecName: "util") pod "b09c33a7-5fbd-4594-bebf-bcb9a5519e89" (UID: "b09c33a7-5fbd-4594-bebf-bcb9a5519e89"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.451388 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljfwx\" (UniqueName: \"kubernetes.io/projected/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-kube-api-access-ljfwx\") on node \"crc\" DevicePath \"\""
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.451458 4867 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.451472 4867 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b09c33a7-5fbd-4594-bebf-bcb9a5519e89-util\") on node \"crc\" DevicePath \"\""
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.959408 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b" event={"ID":"b09c33a7-5fbd-4594-bebf-bcb9a5519e89","Type":"ContainerDied","Data":"e87d5117dccf80886a662890e562edcb4eb4cd1d30194e6a9cb4fea02dc3c675"}
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.959968 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e87d5117dccf80886a662890e562edcb4eb4cd1d30194e6a9cb4fea02dc3c675"
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.959505 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b"
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.962913 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="21254038d2e08625414f4e3fd77d4aa603650bf9aa5cea1080c49abec73a2651" exitCode=0
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.962947 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"21254038d2e08625414f4e3fd77d4aa603650bf9aa5cea1080c49abec73a2651"}
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.962964 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"0a46508d237859c347210237945b8f376811db88e9f318300207a6c9aaeafb5d"}
Oct 06 13:17:43 crc kubenswrapper[4867]: I1006 13:17:43.962983 4867 scope.go:117] "RemoveContainer" containerID="9e6c0a282c79916c755a5288bb4eb5014e330cf8dede8b87679a8dc2b50be474"
Oct 06 13:17:44 crc kubenswrapper[4867]: I1006 13:17:44.776919 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:44 crc kubenswrapper[4867]: I1006 13:17:44.776994 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:44 crc kubenswrapper[4867]: I1006 13:17:44.832335 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:45 crc kubenswrapper[4867]: I1006 13:17:45.029642 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.215916 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fw5nc"]
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.216311 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fw5nc" podUID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" containerName="registry-server" containerID="cri-o://4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2" gracePeriod=2
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.647893 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw5nc"
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.714589 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-utilities\") pod \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") "
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.714715 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tklj\" (UniqueName: \"kubernetes.io/projected/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-kube-api-access-9tklj\") pod \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") "
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.714796 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-catalog-content\") pod \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\" (UID: \"a8c5bde4-82ee-4ac7-9e51-c77f809776e6\") "
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.716018 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-utilities" (OuterVolumeSpecName: "utilities") pod "a8c5bde4-82ee-4ac7-9e51-c77f809776e6" (UID: "a8c5bde4-82ee-4ac7-9e51-c77f809776e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.724401 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-kube-api-access-9tklj" (OuterVolumeSpecName: "kube-api-access-9tklj") pod "a8c5bde4-82ee-4ac7-9e51-c77f809776e6" (UID: "a8c5bde4-82ee-4ac7-9e51-c77f809776e6"). InnerVolumeSpecName "kube-api-access-9tklj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.816889 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.816927 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tklj\" (UniqueName: \"kubernetes.io/projected/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-kube-api-access-9tklj\") on node \"crc\" DevicePath \"\""
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.831784 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8c5bde4-82ee-4ac7-9e51-c77f809776e6" (UID: "a8c5bde4-82ee-4ac7-9e51-c77f809776e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:17:47 crc kubenswrapper[4867]: I1006 13:17:47.918649 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8c5bde4-82ee-4ac7-9e51-c77f809776e6-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.010636 4867 generic.go:334] "Generic (PLEG): container finished" podID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" containerID="4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2" exitCode=0
Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.010699 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw5nc" event={"ID":"a8c5bde4-82ee-4ac7-9e51-c77f809776e6","Type":"ContainerDied","Data":"4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2"}
Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.010734 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fw5nc" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.010743 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw5nc" event={"ID":"a8c5bde4-82ee-4ac7-9e51-c77f809776e6","Type":"ContainerDied","Data":"030a0b654de36fe71c0eaad9f39c76d8175076f95b4ee71e4b3ebf9e6366a0b2"} Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.010829 4867 scope.go:117] "RemoveContainer" containerID="4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.030606 4867 scope.go:117] "RemoveContainer" containerID="85aabb0edf77e4575dce0a838aad71b3a61a3bb1ffa9695d92db6f2a34c5c40f" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.046377 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fw5nc"] Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.051218 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fw5nc"] Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.054653 4867 scope.go:117] "RemoveContainer" containerID="f02484e89f9ca2cd5e50db140a566540dfb34c28e2be3ac339bbc931bc7b91cd" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.080984 4867 scope.go:117] "RemoveContainer" containerID="4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2" Oct 06 13:17:48 crc kubenswrapper[4867]: E1006 13:17:48.081747 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2\": container with ID starting with 4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2 not found: ID does not exist" containerID="4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.081818 4867 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2"} err="failed to get container status \"4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2\": rpc error: code = NotFound desc = could not find container \"4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2\": container with ID starting with 4310ad380c52ad5a523412176f032fae6e19e28f7f40a7ff32baeb3d84a94ee2 not found: ID does not exist" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.081861 4867 scope.go:117] "RemoveContainer" containerID="85aabb0edf77e4575dce0a838aad71b3a61a3bb1ffa9695d92db6f2a34c5c40f" Oct 06 13:17:48 crc kubenswrapper[4867]: E1006 13:17:48.082742 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85aabb0edf77e4575dce0a838aad71b3a61a3bb1ffa9695d92db6f2a34c5c40f\": container with ID starting with 85aabb0edf77e4575dce0a838aad71b3a61a3bb1ffa9695d92db6f2a34c5c40f not found: ID does not exist" containerID="85aabb0edf77e4575dce0a838aad71b3a61a3bb1ffa9695d92db6f2a34c5c40f" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.082805 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85aabb0edf77e4575dce0a838aad71b3a61a3bb1ffa9695d92db6f2a34c5c40f"} err="failed to get container status \"85aabb0edf77e4575dce0a838aad71b3a61a3bb1ffa9695d92db6f2a34c5c40f\": rpc error: code = NotFound desc = could not find container \"85aabb0edf77e4575dce0a838aad71b3a61a3bb1ffa9695d92db6f2a34c5c40f\": container with ID starting with 85aabb0edf77e4575dce0a838aad71b3a61a3bb1ffa9695d92db6f2a34c5c40f not found: ID does not exist" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.082848 4867 scope.go:117] "RemoveContainer" containerID="f02484e89f9ca2cd5e50db140a566540dfb34c28e2be3ac339bbc931bc7b91cd" Oct 06 13:17:48 crc kubenswrapper[4867]: E1006 
13:17:48.083474 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02484e89f9ca2cd5e50db140a566540dfb34c28e2be3ac339bbc931bc7b91cd\": container with ID starting with f02484e89f9ca2cd5e50db140a566540dfb34c28e2be3ac339bbc931bc7b91cd not found: ID does not exist" containerID="f02484e89f9ca2cd5e50db140a566540dfb34c28e2be3ac339bbc931bc7b91cd" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.083513 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02484e89f9ca2cd5e50db140a566540dfb34c28e2be3ac339bbc931bc7b91cd"} err="failed to get container status \"f02484e89f9ca2cd5e50db140a566540dfb34c28e2be3ac339bbc931bc7b91cd\": rpc error: code = NotFound desc = could not find container \"f02484e89f9ca2cd5e50db140a566540dfb34c28e2be3ac339bbc931bc7b91cd\": container with ID starting with f02484e89f9ca2cd5e50db140a566540dfb34c28e2be3ac339bbc931bc7b91cd not found: ID does not exist" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.386586 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8"] Oct 06 13:17:48 crc kubenswrapper[4867]: E1006 13:17:48.387210 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09c33a7-5fbd-4594-bebf-bcb9a5519e89" containerName="extract" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.387225 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09c33a7-5fbd-4594-bebf-bcb9a5519e89" containerName="extract" Oct 06 13:17:48 crc kubenswrapper[4867]: E1006 13:17:48.387239 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" containerName="extract-content" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.387245 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" containerName="extract-content" Oct 06 
13:17:48 crc kubenswrapper[4867]: E1006 13:17:48.387278 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" containerName="extract-utilities" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.387284 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" containerName="extract-utilities" Oct 06 13:17:48 crc kubenswrapper[4867]: E1006 13:17:48.387291 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09c33a7-5fbd-4594-bebf-bcb9a5519e89" containerName="pull" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.387298 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09c33a7-5fbd-4594-bebf-bcb9a5519e89" containerName="pull" Oct 06 13:17:48 crc kubenswrapper[4867]: E1006 13:17:48.387313 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09c33a7-5fbd-4594-bebf-bcb9a5519e89" containerName="util" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.387319 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09c33a7-5fbd-4594-bebf-bcb9a5519e89" containerName="util" Oct 06 13:17:48 crc kubenswrapper[4867]: E1006 13:17:48.387330 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" containerName="registry-server" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.387337 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" containerName="registry-server" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.387491 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" containerName="registry-server" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.387508 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09c33a7-5fbd-4594-bebf-bcb9a5519e89" containerName="extract" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 
13:17:48.388206 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.392286 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-g4gc9" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.401176 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8"] Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.526703 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsm9b\" (UniqueName: \"kubernetes.io/projected/2ed7554d-3165-42b7-b7ac-6ad1b620e825-kube-api-access-rsm9b\") pod \"openstack-operator-controller-operator-6c5b974dc6-zhns8\" (UID: \"2ed7554d-3165-42b7-b7ac-6ad1b620e825\") " pod="openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.628704 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsm9b\" (UniqueName: \"kubernetes.io/projected/2ed7554d-3165-42b7-b7ac-6ad1b620e825-kube-api-access-rsm9b\") pod \"openstack-operator-controller-operator-6c5b974dc6-zhns8\" (UID: \"2ed7554d-3165-42b7-b7ac-6ad1b620e825\") " pod="openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8" Oct 06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.648365 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsm9b\" (UniqueName: \"kubernetes.io/projected/2ed7554d-3165-42b7-b7ac-6ad1b620e825-kube-api-access-rsm9b\") pod \"openstack-operator-controller-operator-6c5b974dc6-zhns8\" (UID: \"2ed7554d-3165-42b7-b7ac-6ad1b620e825\") " pod="openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8" Oct 
06 13:17:48 crc kubenswrapper[4867]: I1006 13:17:48.707944 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8" Oct 06 13:17:49 crc kubenswrapper[4867]: I1006 13:17:49.159666 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8"] Oct 06 13:17:49 crc kubenswrapper[4867]: I1006 13:17:49.232616 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c5bde4-82ee-4ac7-9e51-c77f809776e6" path="/var/lib/kubelet/pods/a8c5bde4-82ee-4ac7-9e51-c77f809776e6/volumes" Oct 06 13:17:50 crc kubenswrapper[4867]: I1006 13:17:50.035089 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8" event={"ID":"2ed7554d-3165-42b7-b7ac-6ad1b620e825","Type":"ContainerStarted","Data":"1e8883a5e8f5466eded2b6f5a7e7db87fc4c3898dfb9981e48e49bc1e5b36669"} Oct 06 13:17:54 crc kubenswrapper[4867]: I1006 13:17:54.065616 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8" event={"ID":"2ed7554d-3165-42b7-b7ac-6ad1b620e825","Type":"ContainerStarted","Data":"011236067ab9cfe4c61be1bc5ea4680ca23eede3978e7500dee6378b99353e0e"} Oct 06 13:17:56 crc kubenswrapper[4867]: I1006 13:17:56.081331 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8" event={"ID":"2ed7554d-3165-42b7-b7ac-6ad1b620e825","Type":"ContainerStarted","Data":"8f8232f8232124823f6481cac4dab763aeeda2436124b985e5059e8a1da2c516"} Oct 06 13:17:56 crc kubenswrapper[4867]: I1006 13:17:56.081704 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8" Oct 06 13:17:56 crc kubenswrapper[4867]: I1006 
13:17:56.115961 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8" podStartSLOduration=2.00867824 podStartE2EDuration="8.115936214s" podCreationTimestamp="2025-10-06 13:17:48 +0000 UTC" firstStartedPulling="2025-10-06 13:17:49.169659163 +0000 UTC m=+848.627607307" lastFinishedPulling="2025-10-06 13:17:55.276917137 +0000 UTC m=+854.734865281" observedRunningTime="2025-10-06 13:17:56.110623789 +0000 UTC m=+855.568571933" watchObservedRunningTime="2025-10-06 13:17:56.115936214 +0000 UTC m=+855.573884358" Oct 06 13:17:58 crc kubenswrapper[4867]: I1006 13:17:58.712000 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6c5b974dc6-zhns8" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.244475 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.248442 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.252310 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ks4j8" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.260345 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df2bl\" (UniqueName: \"kubernetes.io/projected/c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536-kube-api-access-df2bl\") pod \"barbican-operator-controller-manager-58c4cd55f4-4tgfn\" (UID: \"c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.262382 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.274635 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.276207 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.280666 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.282115 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-ht29f" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.295179 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.296487 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.318395 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-pf44d" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.339209 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.340838 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.344176 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.345284 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-pjtlm" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.363036 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdmlx\" (UniqueName: \"kubernetes.io/projected/7050df56-39f0-4962-878b-7e9c498d86d4-kube-api-access-zdmlx\") pod \"glance-operator-controller-manager-5dc44df7d5-84drf\" (UID: \"7050df56-39f0-4962-878b-7e9c498d86d4\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.363100 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df2bl\" (UniqueName: \"kubernetes.io/projected/c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536-kube-api-access-df2bl\") pod \"barbican-operator-controller-manager-58c4cd55f4-4tgfn\" (UID: \"c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.363167 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjpv4\" (UniqueName: \"kubernetes.io/projected/3c3a38a7-d3a0-4c01-aae9-645d5dada80f-kube-api-access-tjpv4\") pod \"cinder-operator-controller-manager-7d4d4f8d-v72nf\" (UID: \"3c3a38a7-d3a0-4c01-aae9-645d5dada80f\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.363211 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvq2f\" (UniqueName: \"kubernetes.io/projected/3b46e0ea-7a30-45ab-99cc-d36efd3fc75e-kube-api-access-rvq2f\") pod \"designate-operator-controller-manager-75dfd9b554-s4qrw\" (UID: \"3b46e0ea-7a30-45ab-99cc-d36efd3fc75e\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.367639 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.368784 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.387184 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-2sx62" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.423745 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df2bl\" (UniqueName: \"kubernetes.io/projected/c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536-kube-api-access-df2bl\") pod \"barbican-operator-controller-manager-58c4cd55f4-4tgfn\" (UID: \"c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.423845 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.427213 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.445077 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-r894s" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.457047 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.463538 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.464584 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjpv4\" (UniqueName: \"kubernetes.io/projected/3c3a38a7-d3a0-4c01-aae9-645d5dada80f-kube-api-access-tjpv4\") pod \"cinder-operator-controller-manager-7d4d4f8d-v72nf\" (UID: \"3c3a38a7-d3a0-4c01-aae9-645d5dada80f\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.464665 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvq2f\" (UniqueName: \"kubernetes.io/projected/3b46e0ea-7a30-45ab-99cc-d36efd3fc75e-kube-api-access-rvq2f\") pod \"designate-operator-controller-manager-75dfd9b554-s4qrw\" (UID: \"3b46e0ea-7a30-45ab-99cc-d36efd3fc75e\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.464699 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdmlx\" (UniqueName: \"kubernetes.io/projected/7050df56-39f0-4962-878b-7e9c498d86d4-kube-api-access-zdmlx\") pod \"glance-operator-controller-manager-5dc44df7d5-84drf\" (UID: \"7050df56-39f0-4962-878b-7e9c498d86d4\") " 
pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.464720 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxhjg\" (UniqueName: \"kubernetes.io/projected/901a13c6-49ea-4126-8b2d-7c7901720f05-kube-api-access-vxhjg\") pod \"horizon-operator-controller-manager-76d5b87f47-v222m\" (UID: \"901a13c6-49ea-4126-8b2d-7c7901720f05\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.464785 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42p6m\" (UniqueName: \"kubernetes.io/projected/5d14ff34-79c1-467d-99b0-35202d1650bb-kube-api-access-42p6m\") pod \"heat-operator-controller-manager-54b4974c45-jxt5n\" (UID: \"5d14ff34-79c1-467d-99b0-35202d1650bb\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.465130 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.469217 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5sdrx" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.469780 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.476860 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.500184 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdmlx\" (UniqueName: \"kubernetes.io/projected/7050df56-39f0-4962-878b-7e9c498d86d4-kube-api-access-zdmlx\") pod \"glance-operator-controller-manager-5dc44df7d5-84drf\" (UID: \"7050df56-39f0-4962-878b-7e9c498d86d4\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.500820 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.501666 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvq2f\" (UniqueName: \"kubernetes.io/projected/3b46e0ea-7a30-45ab-99cc-d36efd3fc75e-kube-api-access-rvq2f\") pod \"designate-operator-controller-manager-75dfd9b554-s4qrw\" (UID: \"3b46e0ea-7a30-45ab-99cc-d36efd3fc75e\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.504094 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"] 
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.506612 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjpv4\" (UniqueName: \"kubernetes.io/projected/3c3a38a7-d3a0-4c01-aae9-645d5dada80f-kube-api-access-tjpv4\") pod \"cinder-operator-controller-manager-7d4d4f8d-v72nf\" (UID: \"3c3a38a7-d3a0-4c01-aae9-645d5dada80f\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.519540 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.520853 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.525267 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-znm67" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.537693 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp"] Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.566483 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpcll\" (UniqueName: \"kubernetes.io/projected/92cf840d-e92d-4212-8d63-2d623040ca46-kube-api-access-zpcll\") pod \"ironic-operator-controller-manager-649675d675-qhnpp\" (UID: \"92cf840d-e92d-4212-8d63-2d623040ca46\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp" Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.566583 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxhjg\" (UniqueName: 
\"kubernetes.io/projected/901a13c6-49ea-4126-8b2d-7c7901720f05-kube-api-access-vxhjg\") pod \"horizon-operator-controller-manager-76d5b87f47-v222m\" (UID: \"901a13c6-49ea-4126-8b2d-7c7901720f05\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.566668 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/311ba4cb-158b-41f4-ada4-4fed1c0f2ede-cert\") pod \"infra-operator-controller-manager-658588b8c9-n64zf\" (UID: \"311ba4cb-158b-41f4-ada4-4fed1c0f2ede\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.566729 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42p6m\" (UniqueName: \"kubernetes.io/projected/5d14ff34-79c1-467d-99b0-35202d1650bb-kube-api-access-42p6m\") pod \"heat-operator-controller-manager-54b4974c45-jxt5n\" (UID: \"5d14ff34-79c1-467d-99b0-35202d1650bb\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.566865 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrjvp\" (UniqueName: \"kubernetes.io/projected/311ba4cb-158b-41f4-ada4-4fed1c0f2ede-kube-api-access-mrjvp\") pod \"infra-operator-controller-manager-658588b8c9-n64zf\" (UID: \"311ba4cb-158b-41f4-ada4-4fed1c0f2ede\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.575946 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.577309 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.579581 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.584430 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-j6jfl"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.600498 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.601903 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.605814 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.612192 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tbggq"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.613374 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.642275 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42p6m\" (UniqueName: \"kubernetes.io/projected/5d14ff34-79c1-467d-99b0-35202d1650bb-kube-api-access-42p6m\") pod \"heat-operator-controller-manager-54b4974c45-jxt5n\" (UID: \"5d14ff34-79c1-467d-99b0-35202d1650bb\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.647185 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.649787 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxhjg\" (UniqueName: \"kubernetes.io/projected/901a13c6-49ea-4126-8b2d-7c7901720f05-kube-api-access-vxhjg\") pod \"horizon-operator-controller-manager-76d5b87f47-v222m\" (UID: \"901a13c6-49ea-4126-8b2d-7c7901720f05\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.658560 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.665343 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.667067 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.667792 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpcll\" (UniqueName: \"kubernetes.io/projected/92cf840d-e92d-4212-8d63-2d623040ca46-kube-api-access-zpcll\") pod \"ironic-operator-controller-manager-649675d675-qhnpp\" (UID: \"92cf840d-e92d-4212-8d63-2d623040ca46\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.667858 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/311ba4cb-158b-41f4-ada4-4fed1c0f2ede-cert\") pod \"infra-operator-controller-manager-658588b8c9-n64zf\" (UID: \"311ba4cb-158b-41f4-ada4-4fed1c0f2ede\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.667920 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml8gx\" (UniqueName: \"kubernetes.io/projected/831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7-kube-api-access-ml8gx\") pod \"manila-operator-controller-manager-65d89cfd9f-njdr6\" (UID: \"831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.667976 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrjvp\" (UniqueName: \"kubernetes.io/projected/311ba4cb-158b-41f4-ada4-4fed1c0f2ede-kube-api-access-mrjvp\") pod \"infra-operator-controller-manager-658588b8c9-n64zf\" (UID: \"311ba4cb-158b-41f4-ada4-4fed1c0f2ede\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.668005 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwz8b\" (UniqueName: \"kubernetes.io/projected/21147e7d-1dd6-4a90-ab7a-f923f014a281-kube-api-access-gwz8b\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-fwbwb\" (UID: \"21147e7d-1dd6-4a90-ab7a-f923f014a281\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.668055 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.671480 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.673696 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.680342 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-bwm2k"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.680636 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-t9qld"
Oct 06 13:18:35 crc kubenswrapper[4867]: E1006 13:18:35.682615 4867 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Oct 06 13:18:35 crc kubenswrapper[4867]: E1006 13:18:35.682760 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/311ba4cb-158b-41f4-ada4-4fed1c0f2ede-cert podName:311ba4cb-158b-41f4-ada4-4fed1c0f2ede nodeName:}" failed. No retries permitted until 2025-10-06 13:18:36.182736982 +0000 UTC m=+895.640685126 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/311ba4cb-158b-41f4-ada4-4fed1c0f2ede-cert") pod "infra-operator-controller-manager-658588b8c9-n64zf" (UID: "311ba4cb-158b-41f4-ada4-4fed1c0f2ede") : secret "infra-operator-webhook-server-cert" not found
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.718954 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.738445 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrjvp\" (UniqueName: \"kubernetes.io/projected/311ba4cb-158b-41f4-ada4-4fed1c0f2ede-kube-api-access-mrjvp\") pod \"infra-operator-controller-manager-658588b8c9-n64zf\" (UID: \"311ba4cb-158b-41f4-ada4-4fed1c0f2ede\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.745356 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.770144 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml8gx\" (UniqueName: \"kubernetes.io/projected/831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7-kube-api-access-ml8gx\") pod \"manila-operator-controller-manager-65d89cfd9f-njdr6\" (UID: \"831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.770197 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c8v6\" (UniqueName: \"kubernetes.io/projected/6c53454b-e984-4366-8bd1-3c4eb10fb1c8-kube-api-access-6c8v6\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg\" (UID: \"6c53454b-e984-4366-8bd1-3c4eb10fb1c8\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.770240 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwz8b\" (UniqueName: \"kubernetes.io/projected/21147e7d-1dd6-4a90-ab7a-f923f014a281-kube-api-access-gwz8b\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-fwbwb\" (UID: \"21147e7d-1dd6-4a90-ab7a-f923f014a281\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.770306 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72lhs\" (UniqueName: \"kubernetes.io/projected/3d2faf90-2410-459e-a8a3-668296923f2e-kube-api-access-72lhs\") pod \"neutron-operator-controller-manager-8d984cc4d-7jwqc\" (UID: \"3d2faf90-2410-459e-a8a3-668296923f2e\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.770605 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpcll\" (UniqueName: \"kubernetes.io/projected/92cf840d-e92d-4212-8d63-2d623040ca46-kube-api-access-zpcll\") pod \"ironic-operator-controller-manager-649675d675-qhnpp\" (UID: \"92cf840d-e92d-4212-8d63-2d623040ca46\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.791700 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.792721 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.799219 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml8gx\" (UniqueName: \"kubernetes.io/projected/831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7-kube-api-access-ml8gx\") pod \"manila-operator-controller-manager-65d89cfd9f-njdr6\" (UID: \"831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.805998 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.808300 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.810669 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwz8b\" (UniqueName: \"kubernetes.io/projected/21147e7d-1dd6-4a90-ab7a-f923f014a281-kube-api-access-gwz8b\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-fwbwb\" (UID: \"21147e7d-1dd6-4a90-ab7a-f923f014a281\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.815741 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-kdzcc"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.869976 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.880432 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.882143 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.885435 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.886657 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-vclv6"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.896216 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c8v6\" (UniqueName: \"kubernetes.io/projected/6c53454b-e984-4366-8bd1-3c4eb10fb1c8-kube-api-access-6c8v6\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg\" (UID: \"6c53454b-e984-4366-8bd1-3c4eb10fb1c8\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.896299 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72lhs\" (UniqueName: \"kubernetes.io/projected/3d2faf90-2410-459e-a8a3-668296923f2e-kube-api-access-72lhs\") pod \"neutron-operator-controller-manager-8d984cc4d-7jwqc\" (UID: \"3d2faf90-2410-459e-a8a3-668296923f2e\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.896340 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frgfj\" (UniqueName: \"kubernetes.io/projected/15792c9d-8f60-4b13-8623-55c9a6a7319b-kube-api-access-frgfj\") pod \"nova-operator-controller-manager-7c7fc454ff-kpx9k\" (UID: \"15792c9d-8f60-4b13-8623-55c9a6a7319b\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.899716 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.907381 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.922375 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72lhs\" (UniqueName: \"kubernetes.io/projected/3d2faf90-2410-459e-a8a3-668296923f2e-kube-api-access-72lhs\") pod \"neutron-operator-controller-manager-8d984cc4d-7jwqc\" (UID: \"3d2faf90-2410-459e-a8a3-668296923f2e\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.925454 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.927433 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c8v6\" (UniqueName: \"kubernetes.io/projected/6c53454b-e984-4366-8bd1-3c4eb10fb1c8-kube-api-access-6c8v6\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg\" (UID: \"6c53454b-e984-4366-8bd1-3c4eb10fb1c8\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.927498 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.934791 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.938956 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zpms8"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.939166 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.939285 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-kzdwj"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.944156 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.960434 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt"]
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.998943 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frgfj\" (UniqueName: \"kubernetes.io/projected/15792c9d-8f60-4b13-8623-55c9a6a7319b-kube-api-access-frgfj\") pod \"nova-operator-controller-manager-7c7fc454ff-kpx9k\" (UID: \"15792c9d-8f60-4b13-8623-55c9a6a7319b\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.999005 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe49bb4-18db-473e-b57c-2047bbbe2405-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt\" (UID: \"dbe49bb4-18db-473e-b57c-2047bbbe2405\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.999078 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgvfw\" (UniqueName: \"kubernetes.io/projected/dbe49bb4-18db-473e-b57c-2047bbbe2405-kube-api-access-vgvfw\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt\" (UID: \"dbe49bb4-18db-473e-b57c-2047bbbe2405\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.999100 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q8x8\" (UniqueName: \"kubernetes.io/projected/95e501d6-fddf-4baa-befd-25c5c5f3303e-kube-api-access-6q8x8\") pod \"octavia-operator-controller-manager-7468f855d8-v6b9l\" (UID: \"95e501d6-fddf-4baa-befd-25c5c5f3303e\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l"
Oct 06 13:18:35 crc kubenswrapper[4867]: I1006 13:18:35.999141 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8jlh\" (UniqueName: \"kubernetes.io/projected/efcff7d5-4481-45ea-b693-ebc63e9f1458-kube-api-access-v8jlh\") pod \"ovn-operator-controller-manager-6d8b6f9b9-4lsh6\" (UID: \"efcff7d5-4481-45ea-b693-ebc63e9f1458\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.000158 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6"]
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.005988 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.041472 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-922cb"]
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.045742 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.047913 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.050829 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-njtvx"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.059731 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frgfj\" (UniqueName: \"kubernetes.io/projected/15792c9d-8f60-4b13-8623-55c9a6a7319b-kube-api-access-frgfj\") pod \"nova-operator-controller-manager-7c7fc454ff-kpx9k\" (UID: \"15792c9d-8f60-4b13-8623-55c9a6a7319b\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.065367 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-922cb"]
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.073619 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc"]
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.086637 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.086703 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.089829 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xvjxf"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.090052 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc"]
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.098068 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj"]
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.102208 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgvfw\" (UniqueName: \"kubernetes.io/projected/dbe49bb4-18db-473e-b57c-2047bbbe2405-kube-api-access-vgvfw\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt\" (UID: \"dbe49bb4-18db-473e-b57c-2047bbbe2405\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.102276 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q8x8\" (UniqueName: \"kubernetes.io/projected/95e501d6-fddf-4baa-befd-25c5c5f3303e-kube-api-access-6q8x8\") pod \"octavia-operator-controller-manager-7468f855d8-v6b9l\" (UID: \"95e501d6-fddf-4baa-befd-25c5c5f3303e\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.102335 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8jlh\" (UniqueName: \"kubernetes.io/projected/efcff7d5-4481-45ea-b693-ebc63e9f1458-kube-api-access-v8jlh\") pod \"ovn-operator-controller-manager-6d8b6f9b9-4lsh6\" (UID: \"efcff7d5-4481-45ea-b693-ebc63e9f1458\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.102361 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qqfs\" (UniqueName: \"kubernetes.io/projected/937696b7-f234-4e2e-97b3-9ef0f2bf0a90-kube-api-access-5qqfs\") pod \"swift-operator-controller-manager-6859f9b676-48lgc\" (UID: \"937696b7-f234-4e2e-97b3-9ef0f2bf0a90\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.102438 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2rcp\" (UniqueName: \"kubernetes.io/projected/05580142-d01c-470a-afcb-da956c1f6d36-kube-api-access-w2rcp\") pod \"placement-operator-controller-manager-54689d9f88-922cb\" (UID: \"05580142-d01c-470a-afcb-da956c1f6d36\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.102502 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe49bb4-18db-473e-b57c-2047bbbe2405-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt\" (UID: \"dbe49bb4-18db-473e-b57c-2047bbbe2405\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt"
Oct 06 13:18:36 crc kubenswrapper[4867]: E1006 13:18:36.102655 4867 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 06 13:18:36 crc kubenswrapper[4867]: E1006 13:18:36.102720 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbe49bb4-18db-473e-b57c-2047bbbe2405-cert podName:dbe49bb4-18db-473e-b57c-2047bbbe2405 nodeName:}" failed. No retries permitted until 2025-10-06 13:18:36.602701443 +0000 UTC m=+896.060649577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dbe49bb4-18db-473e-b57c-2047bbbe2405-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt" (UID: "dbe49bb4-18db-473e-b57c-2047bbbe2405") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.103926 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.105769 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xqqsq"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.115616 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8"]
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.116975 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.130039 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj"]
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.143426 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mzk58"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.153983 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgvfw\" (UniqueName: \"kubernetes.io/projected/dbe49bb4-18db-473e-b57c-2047bbbe2405-kube-api-access-vgvfw\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt\" (UID: \"dbe49bb4-18db-473e-b57c-2047bbbe2405\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.156320 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8jlh\" (UniqueName: \"kubernetes.io/projected/efcff7d5-4481-45ea-b693-ebc63e9f1458-kube-api-access-v8jlh\") pod \"ovn-operator-controller-manager-6d8b6f9b9-4lsh6\" (UID: \"efcff7d5-4481-45ea-b693-ebc63e9f1458\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.157440 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.161285 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5"]
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.162605 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.164500 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-jdqp2"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.167015 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q8x8\" (UniqueName: \"kubernetes.io/projected/95e501d6-fddf-4baa-befd-25c5c5f3303e-kube-api-access-6q8x8\") pod \"octavia-operator-controller-manager-7468f855d8-v6b9l\" (UID: \"95e501d6-fddf-4baa-befd-25c5c5f3303e\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.192908 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8"]
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.213811 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hscnd\" (UniqueName: \"kubernetes.io/projected/d7b781c6-8500-43b4-884d-e67aadad8518-kube-api-access-hscnd\") pod \"telemetry-operator-controller-manager-5d4d74dd89-wrrdj\" (UID: \"d7b781c6-8500-43b4-884d-e67aadad8518\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.213876 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w422\" (UniqueName: \"kubernetes.io/projected/b099322d-539c-4c48-9344-62e1fec437ab-kube-api-access-6w422\") pod \"watcher-operator-controller-manager-55dcdc7cc-z7lp5\" (UID: \"b099322d-539c-4c48-9344-62e1fec437ab\") " pod="openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.213986 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/311ba4cb-158b-41f4-ada4-4fed1c0f2ede-cert\") pod \"infra-operator-controller-manager-658588b8c9-n64zf\" (UID: \"311ba4cb-158b-41f4-ada4-4fed1c0f2ede\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.214051 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qqfs\" (UniqueName: \"kubernetes.io/projected/937696b7-f234-4e2e-97b3-9ef0f2bf0a90-kube-api-access-5qqfs\") pod \"swift-operator-controller-manager-6859f9b676-48lgc\" (UID: \"937696b7-f234-4e2e-97b3-9ef0f2bf0a90\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.214121 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2rcp\" (UniqueName: \"kubernetes.io/projected/05580142-d01c-470a-afcb-da956c1f6d36-kube-api-access-w2rcp\") pod \"placement-operator-controller-manager-54689d9f88-922cb\" (UID: \"05580142-d01c-470a-afcb-da956c1f6d36\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.214146 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87w9x\" (UniqueName: \"kubernetes.io/projected/9d369f1b-62ae-4b24-8287-fd62b21122ce-kube-api-access-87w9x\") pod \"test-operator-controller-manager-5cd5cb47d7-tvpx8\" (UID: \"9d369f1b-62ae-4b24-8287-fd62b21122ce\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.217974 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.222795 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5"]
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.225325 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/311ba4cb-158b-41f4-ada4-4fed1c0f2ede-cert\") pod \"infra-operator-controller-manager-658588b8c9-n64zf\" (UID: \"311ba4cb-158b-41f4-ada4-4fed1c0f2ede\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.246301 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2rcp\" (UniqueName: \"kubernetes.io/projected/05580142-d01c-470a-afcb-da956c1f6d36-kube-api-access-w2rcp\") pod \"placement-operator-controller-manager-54689d9f88-922cb\" (UID: \"05580142-d01c-470a-afcb-da956c1f6d36\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.259180 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qqfs\" (UniqueName: \"kubernetes.io/projected/937696b7-f234-4e2e-97b3-9ef0f2bf0a90-kube-api-access-5qqfs\") pod \"swift-operator-controller-manager-6859f9b676-48lgc\" (UID: \"937696b7-f234-4e2e-97b3-9ef0f2bf0a90\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.318606 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87w9x\" (UniqueName: \"kubernetes.io/projected/9d369f1b-62ae-4b24-8287-fd62b21122ce-kube-api-access-87w9x\") pod \"test-operator-controller-manager-5cd5cb47d7-tvpx8\" (UID: \"9d369f1b-62ae-4b24-8287-fd62b21122ce\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.318680 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hscnd\" (UniqueName: \"kubernetes.io/projected/d7b781c6-8500-43b4-884d-e67aadad8518-kube-api-access-hscnd\") pod \"telemetry-operator-controller-manager-5d4d74dd89-wrrdj\" (UID: \"d7b781c6-8500-43b4-884d-e67aadad8518\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.318711 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w422\" (UniqueName: \"kubernetes.io/projected/b099322d-539c-4c48-9344-62e1fec437ab-kube-api-access-6w422\") pod \"watcher-operator-controller-manager-55dcdc7cc-z7lp5\" (UID: \"b099322d-539c-4c48-9344-62e1fec437ab\") " pod="openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.343431 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w422\" (UniqueName: \"kubernetes.io/projected/b099322d-539c-4c48-9344-62e1fec437ab-kube-api-access-6w422\") pod \"watcher-operator-controller-manager-55dcdc7cc-z7lp5\" (UID: \"b099322d-539c-4c48-9344-62e1fec437ab\") " pod="openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.348142 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hscnd\" (UniqueName: \"kubernetes.io/projected/d7b781c6-8500-43b4-884d-e67aadad8518-kube-api-access-hscnd\") pod \"telemetry-operator-controller-manager-5d4d74dd89-wrrdj\" (UID: \"d7b781c6-8500-43b4-884d-e67aadad8518\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj"
Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.356995 4867
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87w9x\" (UniqueName: \"kubernetes.io/projected/9d369f1b-62ae-4b24-8287-fd62b21122ce-kube-api-access-87w9x\") pod \"test-operator-controller-manager-5cd5cb47d7-tvpx8\" (UID: \"9d369f1b-62ae-4b24-8287-fd62b21122ce\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.373393 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5"] Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.374725 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.378586 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2w66z" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.378852 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.383519 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5"] Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.401858 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp"] Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.403020 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.408534 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.409519 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6zh2j" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.420523 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56xn5\" (UniqueName: \"kubernetes.io/projected/94790623-543f-45ee-9579-6e837ce82cd8-kube-api-access-56xn5\") pod \"openstack-operator-controller-manager-66dbf6f685-4srz5\" (UID: \"94790623-543f-45ee-9579-6e837ce82cd8\") " pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.420651 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94790623-543f-45ee-9579-6e837ce82cd8-cert\") pod \"openstack-operator-controller-manager-66dbf6f685-4srz5\" (UID: \"94790623-543f-45ee-9579-6e837ce82cd8\") " pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.446119 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.447296 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp"] Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.474765 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.486765 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.510085 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.527152 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94790623-543f-45ee-9579-6e837ce82cd8-cert\") pod \"openstack-operator-controller-manager-66dbf6f685-4srz5\" (UID: \"94790623-543f-45ee-9579-6e837ce82cd8\") " pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.527307 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2lc6\" (UniqueName: \"kubernetes.io/projected/bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3-kube-api-access-t2lc6\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp\" (UID: \"bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.527359 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56xn5\" (UniqueName: \"kubernetes.io/projected/94790623-543f-45ee-9579-6e837ce82cd8-kube-api-access-56xn5\") pod \"openstack-operator-controller-manager-66dbf6f685-4srz5\" (UID: \"94790623-543f-45ee-9579-6e837ce82cd8\") " pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" Oct 06 13:18:36 crc kubenswrapper[4867]: E1006 13:18:36.527808 4867 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 06 13:18:36 crc kubenswrapper[4867]: E1006 13:18:36.527985 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/94790623-543f-45ee-9579-6e837ce82cd8-cert podName:94790623-543f-45ee-9579-6e837ce82cd8 nodeName:}" failed. No retries permitted until 2025-10-06 13:18:37.027951879 +0000 UTC m=+896.485900023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94790623-543f-45ee-9579-6e837ce82cd8-cert") pod "openstack-operator-controller-manager-66dbf6f685-4srz5" (UID: "94790623-543f-45ee-9579-6e837ce82cd8") : secret "webhook-server-cert" not found Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.540936 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.562055 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56xn5\" (UniqueName: \"kubernetes.io/projected/94790623-543f-45ee-9579-6e837ce82cd8-kube-api-access-56xn5\") pod \"openstack-operator-controller-manager-66dbf6f685-4srz5\" (UID: \"94790623-543f-45ee-9579-6e837ce82cd8\") " pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.587745 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.628384 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe49bb4-18db-473e-b57c-2047bbbe2405-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt\" (UID: \"dbe49bb4-18db-473e-b57c-2047bbbe2405\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.628453 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2lc6\" (UniqueName: \"kubernetes.io/projected/bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3-kube-api-access-t2lc6\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp\" (UID: \"bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.642154 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbe49bb4-18db-473e-b57c-2047bbbe2405-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt\" (UID: \"dbe49bb4-18db-473e-b57c-2047bbbe2405\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.657210 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2lc6\" (UniqueName: \"kubernetes.io/projected/bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3-kube-api-access-t2lc6\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp\" (UID: \"bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.861370 4867 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt" Oct 06 13:18:36 crc kubenswrapper[4867]: I1006 13:18:36.881417 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp" Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.037234 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94790623-543f-45ee-9579-6e837ce82cd8-cert\") pod \"openstack-operator-controller-manager-66dbf6f685-4srz5\" (UID: \"94790623-543f-45ee-9579-6e837ce82cd8\") " pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.054613 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94790623-543f-45ee-9579-6e837ce82cd8-cert\") pod \"openstack-operator-controller-manager-66dbf6f685-4srz5\" (UID: \"94790623-543f-45ee-9579-6e837ce82cd8\") " pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.145641 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.236945 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf"] Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.252877 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.289503 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn"] Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.427901 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf"] Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.434534 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n"] Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.459299 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw"] Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.481153 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf" event={"ID":"3c3a38a7-d3a0-4c01-aae9-645d5dada80f","Type":"ContainerStarted","Data":"902994f8779bf272a79a678d5cfe742883f3d7fd88d92e35be7ac1ffd2d3a43f"} Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.486133 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn" event={"ID":"c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536","Type":"ContainerStarted","Data":"7b14e193392800c361a053f82080257cab38b4709bedd18d76a2e94dc6315ebe"} Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 
13:18:37.898488 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp"] Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.911658 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6"] Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.918378 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m"] Oct 06 13:18:37 crc kubenswrapper[4867]: W1006 13:18:37.933990 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92cf840d_e92d_4212_8d63_2d623040ca46.slice/crio-e6ddf09ead7f9836f10b2750022a4d6cc280f0aa66baeed559b9a2812eec687e WatchSource:0}: Error finding container e6ddf09ead7f9836f10b2750022a4d6cc280f0aa66baeed559b9a2812eec687e: Status 404 returned error can't find the container with id e6ddf09ead7f9836f10b2750022a4d6cc280f0aa66baeed559b9a2812eec687e Oct 06 13:18:37 crc kubenswrapper[4867]: W1006 13:18:37.941782 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod831b6b11_3e22_4ae1_aa26_1ccb9a6bacb7.slice/crio-903414ce0ce4421a917c3c0aca88bf095774c5ac250a4062d63c345ca5ffc768 WatchSource:0}: Error finding container 903414ce0ce4421a917c3c0aca88bf095774c5ac250a4062d63c345ca5ffc768: Status 404 returned error can't find the container with id 903414ce0ce4421a917c3c0aca88bf095774c5ac250a4062d63c345ca5ffc768 Oct 06 13:18:37 crc kubenswrapper[4867]: W1006 13:18:37.943202 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod901a13c6_49ea_4126_8b2d_7c7901720f05.slice/crio-8c7b5c9508e3f3452009f9261be17fd12538271615de9878c6425330f90577e1 WatchSource:0}: Error finding container 
8c7b5c9508e3f3452009f9261be17fd12538271615de9878c6425330f90577e1: Status 404 returned error can't find the container with id 8c7b5c9508e3f3452009f9261be17fd12538271615de9878c6425330f90577e1 Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.946486 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj"] Oct 06 13:18:37 crc kubenswrapper[4867]: I1006 13:18:37.986385 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8"] Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.005240 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k"] Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.011779 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6"] Oct 06 13:18:38 crc kubenswrapper[4867]: W1006 13:18:38.022683 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d2faf90_2410_459e_a8a3_668296923f2e.slice/crio-eb2a8a3a83e791d92157e6cec9bc9843614d0c80abcd9870634d3003edb87064 WatchSource:0}: Error finding container eb2a8a3a83e791d92157e6cec9bc9843614d0c80abcd9870634d3003edb87064: Status 404 returned error can't find the container with id eb2a8a3a83e791d92157e6cec9bc9843614d0c80abcd9870634d3003edb87064 Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.024867 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"] Oct 06 13:18:38 crc kubenswrapper[4867]: W1006 13:18:38.027979 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95e501d6_fddf_4baa_befd_25c5c5f3303e.slice/crio-dd0706e6ecf0dd59aeb00f955c2966e32dea48ad9fd273b0c96707ce5fbab00c WatchSource:0}: Error finding container dd0706e6ecf0dd59aeb00f955c2966e32dea48ad9fd273b0c96707ce5fbab00c: Status 404 returned error can't find the container with id dd0706e6ecf0dd59aeb00f955c2966e32dea48ad9fd273b0c96707ce5fbab00c Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.033652 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc"] Oct 06 13:18:38 crc kubenswrapper[4867]: W1006 13:18:38.038949 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15792c9d_8f60_4b13_8623_55c9a6a7319b.slice/crio-a579c5c5276e0950c3f82cba683bcbefa52c76c0d540c350e0840f9d80e761ca WatchSource:0}: Error finding container a579c5c5276e0950c3f82cba683bcbefa52c76c0d540c350e0840f9d80e761ca: Status 404 returned error can't find the container with id a579c5c5276e0950c3f82cba683bcbefa52c76c0d540c350e0840f9d80e761ca Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.042849 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} 
{} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-frgfj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7c7fc454ff-kpx9k_openstack-operators(15792c9d-8f60-4b13-8623-55c9a6a7319b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.042903 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l"] Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.059288 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-922cb"] Oct 06 13:18:38 crc kubenswrapper[4867]: W1006 13:18:38.060480 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21147e7d_1dd6_4a90_ab7a_f923f014a281.slice/crio-a1e87a58560d2de8c1eebf5ca827aa36a53fda1399b5d199f0216a1ae38ba28d WatchSource:0}: Error finding container a1e87a58560d2de8c1eebf5ca827aa36a53fda1399b5d199f0216a1ae38ba28d: Status 404 returned error can't find the container with id a1e87a58560d2de8c1eebf5ca827aa36a53fda1399b5d199f0216a1ae38ba28d Oct 06 13:18:38 crc kubenswrapper[4867]: W1006 13:18:38.061276 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d369f1b_62ae_4b24_8287_fd62b21122ce.slice/crio-4837d7d69fe73fa2a62ca9aa5c58a98c225d7696dc9826d13c2ef40cf4c6136c WatchSource:0}: Error finding container 4837d7d69fe73fa2a62ca9aa5c58a98c225d7696dc9826d13c2ef40cf4c6136c: Status 404 returned error can't find the container with id 4837d7d69fe73fa2a62ca9aa5c58a98c225d7696dc9826d13c2ef40cf4c6136c Oct 06 13:18:38 crc kubenswrapper[4867]: W1006 13:18:38.061922 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c53454b_e984_4366_8bd1_3c4eb10fb1c8.slice/crio-f5d53a0a3875dca805a31b63d10ea419ed39cf0a66c8417fb1a6efa71093da07 WatchSource:0}: Error finding container f5d53a0a3875dca805a31b63d10ea419ed39cf0a66c8417fb1a6efa71093da07: Status 404 returned error can't find the container with id f5d53a0a3875dca805a31b63d10ea419ed39cf0a66c8417fb1a6efa71093da07 Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.070231 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5"] Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.070457 4867 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6c8v6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg_openstack-operators(6c53454b-e984-4366-8bd1-3c4eb10fb1c8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.071756 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w2rcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-54689d9f88-922cb_openstack-operators(05580142-d01c-470a-afcb-da956c1f6d36): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 13:18:38 crc kubenswrapper[4867]: W1006 13:18:38.075356 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbe49bb4_18db_473e_b57c_2047bbbe2405.slice/crio-3bca5aeea4965075212add74eff98e76dc448ecda962b81c76b602fd8249a50e WatchSource:0}: Error finding container 
3bca5aeea4965075212add74eff98e76dc448ecda962b81c76b602fd8249a50e: Status 404 returned error can't find the container with id 3bca5aeea4965075212add74eff98e76dc448ecda962b81c76b602fd8249a50e Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.075526 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-87w9x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-tvpx8_openstack-operators(9d369f1b-62ae-4b24-8287-fd62b21122ce): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 13:18:38 crc kubenswrapper[4867]: W1006 13:18:38.076428 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefcff7d5_4481_45ea_b693_ebc63e9f1458.slice/crio-43a41454b1595786dc1895e910f2b80fd9c7e364a5463a7ca22e44b5cbe0e4a4 WatchSource:0}: Error finding container 43a41454b1595786dc1895e910f2b80fd9c7e364a5463a7ca22e44b5cbe0e4a4: Status 404 returned error can't find the container with id 43a41454b1595786dc1895e910f2b80fd9c7e364a5463a7ca22e44b5cbe0e4a4 Oct 06 13:18:38 crc kubenswrapper[4867]: W1006 13:18:38.076648 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod937696b7_f234_4e2e_97b3_9ef0f2bf0a90.slice/crio-dd1ea1382bae25881f93cf47d5fcea5db8bbf3bff66c6c42af9ef78d3dd3cc9c WatchSource:0}: Error finding container dd1ea1382bae25881f93cf47d5fcea5db8bbf3bff66c6c42af9ef78d3dd3cc9c: Status 404 returned error can't find the container with id 
dd1ea1382bae25881f93cf47d5fcea5db8bbf3bff66c6c42af9ef78d3dd3cc9c Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.079993 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-cen
tos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Na
me:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAUL
T,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified
-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IM
AGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHE
R_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vgvfw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt_openstack-operators(dbe49bb4-18db-473e-b57c-2047bbbe2405): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.080645 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:f37e29d1f621c23c0d77b09076006d1e8002a77c2ff3d9b8921f893221cb1d09,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v8jlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-6d8b6f9b9-4lsh6_openstack-operators(efcff7d5-4481-45ea-b693-ebc63e9f1458): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.080948 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5qqfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
swift-operator-controller-manager-6859f9b676-48lgc_openstack-operators(937696b7-f234-4e2e-97b3-9ef0f2bf0a90): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.080982 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc"] Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.086143 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg"] Oct 06 13:18:38 crc kubenswrapper[4867]: W1006 13:18:38.095555 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdb84464_dbf1_4dfc_9a87_b3dde7d0fcc3.slice/crio-e78142b41d4e5261d5d337046fc31c651005a7c8d8a24c65fd533295c52fddcc WatchSource:0}: Error finding container e78142b41d4e5261d5d337046fc31c651005a7c8d8a24c65fd533295c52fddcc: Status 404 returned error can't find the container with id e78142b41d4e5261d5d337046fc31c651005a7c8d8a24c65fd533295c52fddcc Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.095757 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb"] Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.103971 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt"] Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.108591 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t2lc6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp_openstack-operators(bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 
13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.109810 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp" podUID="bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3" Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.114280 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp"] Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.124416 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5"] Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.410418 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg" podUID="6c53454b-e984-4366-8bd1-3c4eb10fb1c8" Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.432273 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k" podUID="15792c9d-8f60-4b13-8623-55c9a6a7319b" Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.440675 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8" podUID="9d369f1b-62ae-4b24-8287-fd62b21122ce" Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.479858 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc" podUID="937696b7-f234-4e2e-97b3-9ef0f2bf0a90" Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.487960 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb" podUID="05580142-d01c-470a-afcb-da956c1f6d36" Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.500299 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k" event={"ID":"15792c9d-8f60-4b13-8623-55c9a6a7319b","Type":"ContainerStarted","Data":"54fc025713ee4682163ceebd7b771db7ed09532c3adfdf18af2198c52fa8d9c3"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.500349 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k" event={"ID":"15792c9d-8f60-4b13-8623-55c9a6a7319b","Type":"ContainerStarted","Data":"a579c5c5276e0950c3f82cba683bcbefa52c76c0d540c350e0840f9d80e761ca"} Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.505862 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k" podUID="15792c9d-8f60-4b13-8623-55c9a6a7319b" Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.508131 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt" event={"ID":"dbe49bb4-18db-473e-b57c-2047bbbe2405","Type":"ContainerStarted","Data":"3bca5aeea4965075212add74eff98e76dc448ecda962b81c76b602fd8249a50e"} Oct 06 13:18:38 
crc kubenswrapper[4867]: I1006 13:18:38.511543 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l" event={"ID":"95e501d6-fddf-4baa-befd-25c5c5f3303e","Type":"ContainerStarted","Data":"dd0706e6ecf0dd59aeb00f955c2966e32dea48ad9fd273b0c96707ce5fbab00c"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.513955 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp" event={"ID":"bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3","Type":"ContainerStarted","Data":"e78142b41d4e5261d5d337046fc31c651005a7c8d8a24c65fd533295c52fddcc"} Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.518629 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp" podUID="bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3" Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.519536 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj" event={"ID":"d7b781c6-8500-43b4-884d-e67aadad8518","Type":"ContainerStarted","Data":"8254bd420619d0c0164bb919f135959c01a7591b06eff1c0899d1337860823f7"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.529787 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" event={"ID":"94790623-543f-45ee-9579-6e837ce82cd8","Type":"ContainerStarted","Data":"9e6769da76a5204fcc19a9965a0fbf9939e5645ab44afd81f973557efbba1238"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.529840 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" event={"ID":"94790623-543f-45ee-9579-6e837ce82cd8","Type":"ContainerStarted","Data":"b3046ecdbdfde3b78fc7d252d624826f2187f8fd34d51cf8ac6a940cf83ef042"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.532483 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n" event={"ID":"5d14ff34-79c1-467d-99b0-35202d1650bb","Type":"ContainerStarted","Data":"005e5e332673dfc666ad4bacc95b739821b036a5903e9c22877db6a100472af6"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.536175 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6" event={"ID":"831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7","Type":"ContainerStarted","Data":"903414ce0ce4421a917c3c0aca88bf095774c5ac250a4062d63c345ca5ffc768"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.568709 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb" event={"ID":"21147e7d-1dd6-4a90-ab7a-f923f014a281","Type":"ContainerStarted","Data":"a1e87a58560d2de8c1eebf5ca827aa36a53fda1399b5d199f0216a1ae38ba28d"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.587698 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp" event={"ID":"92cf840d-e92d-4212-8d63-2d623040ca46","Type":"ContainerStarted","Data":"e6ddf09ead7f9836f10b2750022a4d6cc280f0aa66baeed559b9a2812eec687e"} Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.602649 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6" podUID="efcff7d5-4481-45ea-b693-ebc63e9f1458" Oct 06 13:18:38 crc kubenswrapper[4867]: 
I1006 13:18:38.610536 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw" event={"ID":"3b46e0ea-7a30-45ab-99cc-d36efd3fc75e","Type":"ContainerStarted","Data":"30684887bb346eba4460a8d9c92439584d5067a486578cf819d12b8183b40667"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.631981 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf" event={"ID":"7050df56-39f0-4962-878b-7e9c498d86d4","Type":"ContainerStarted","Data":"f9334f44cb9c57b4e844a5909110c658ba892a0ae695c2ed1ff018acb43e2bf8"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.636280 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc" event={"ID":"3d2faf90-2410-459e-a8a3-668296923f2e","Type":"ContainerStarted","Data":"eb2a8a3a83e791d92157e6cec9bc9843614d0c80abcd9870634d3003edb87064"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.638829 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc" event={"ID":"937696b7-f234-4e2e-97b3-9ef0f2bf0a90","Type":"ContainerStarted","Data":"5e9fee248526e6e81492e6e6e2918732a3900b1bb74e9d06cd363038c4a453e4"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.638856 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc" event={"ID":"937696b7-f234-4e2e-97b3-9ef0f2bf0a90","Type":"ContainerStarted","Data":"dd1ea1382bae25881f93cf47d5fcea5db8bbf3bff66c6c42af9ef78d3dd3cc9c"} Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.641375 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc" podUID="937696b7-f234-4e2e-97b3-9ef0f2bf0a90" Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.644085 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5" event={"ID":"b099322d-539c-4c48-9344-62e1fec437ab","Type":"ContainerStarted","Data":"16c2347ea564a24403500140cbea93f72ccd8953189f53e868043f4ab551fce9"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.649032 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb" event={"ID":"05580142-d01c-470a-afcb-da956c1f6d36","Type":"ContainerStarted","Data":"001f59777c59c17f80043f5df6943389ca85477b1bb600ff6178f332b2fbbe1d"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.649066 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb" event={"ID":"05580142-d01c-470a-afcb-da956c1f6d36","Type":"ContainerStarted","Data":"e2e7cbeea3cfb19d165ef82f72f29179a859e60345205ec147709685739ef6c2"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.654240 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6" event={"ID":"efcff7d5-4481-45ea-b693-ebc63e9f1458","Type":"ContainerStarted","Data":"43a41454b1595786dc1895e910f2b80fd9c7e364a5463a7ca22e44b5cbe0e4a4"} Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.663157 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f37e29d1f621c23c0d77b09076006d1e8002a77c2ff3d9b8921f893221cb1d09\\\"\"" 
pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6" podUID="efcff7d5-4481-45ea-b693-ebc63e9f1458" Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.663363 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb" podUID="05580142-d01c-470a-afcb-da956c1f6d36" Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.677984 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8" event={"ID":"9d369f1b-62ae-4b24-8287-fd62b21122ce","Type":"ContainerStarted","Data":"562f1041e5a54f5edbfbe135f7f2e8daff42573e5f69b8bbe42ffc3f4ddea235"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.678060 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8" event={"ID":"9d369f1b-62ae-4b24-8287-fd62b21122ce","Type":"ContainerStarted","Data":"4837d7d69fe73fa2a62ca9aa5c58a98c225d7696dc9826d13c2ef40cf4c6136c"} Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.681640 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8" podUID="9d369f1b-62ae-4b24-8287-fd62b21122ce" Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.689944 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf" 
event={"ID":"311ba4cb-158b-41f4-ada4-4fed1c0f2ede","Type":"ContainerStarted","Data":"62b2f51f99ee12ee8c99023566b97f7d2d431a687f7950f824145b0d83c23228"} Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.712057 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt" podUID="dbe49bb4-18db-473e-b57c-2047bbbe2405" Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.720129 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg" event={"ID":"6c53454b-e984-4366-8bd1-3c4eb10fb1c8","Type":"ContainerStarted","Data":"a64984a575f702c45809c46ee0e511448ba104ee948a129114cbc0c8961a38c3"} Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.720177 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg" event={"ID":"6c53454b-e984-4366-8bd1-3c4eb10fb1c8","Type":"ContainerStarted","Data":"f5d53a0a3875dca805a31b63d10ea419ed39cf0a66c8417fb1a6efa71093da07"} Oct 06 13:18:38 crc kubenswrapper[4867]: E1006 13:18:38.730567 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg" podUID="6c53454b-e984-4366-8bd1-3c4eb10fb1c8" Oct 06 13:18:38 crc kubenswrapper[4867]: I1006 13:18:38.756923 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m" 
event={"ID":"901a13c6-49ea-4126-8b2d-7c7901720f05","Type":"ContainerStarted","Data":"8c7b5c9508e3f3452009f9261be17fd12538271615de9878c6425330f90577e1"} Oct 06 13:18:39 crc kubenswrapper[4867]: I1006 13:18:39.797383 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" event={"ID":"94790623-543f-45ee-9579-6e837ce82cd8","Type":"ContainerStarted","Data":"a05c7a6ae52732b3688654aacaf64cb68ad88a1b4989ef3c45490fb938c1b384"} Oct 06 13:18:39 crc kubenswrapper[4867]: I1006 13:18:39.797864 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" Oct 06 13:18:39 crc kubenswrapper[4867]: I1006 13:18:39.802847 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt" event={"ID":"dbe49bb4-18db-473e-b57c-2047bbbe2405","Type":"ContainerStarted","Data":"2f62af2fa68457fd66ca2e8d3655d2b3e9d688509063f8c0e71a035e6d91b6c3"} Oct 06 13:18:39 crc kubenswrapper[4867]: E1006 13:18:39.809925 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt" podUID="dbe49bb4-18db-473e-b57c-2047bbbe2405" Oct 06 13:18:39 crc kubenswrapper[4867]: I1006 13:18:39.814180 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6" event={"ID":"efcff7d5-4481-45ea-b693-ebc63e9f1458","Type":"ContainerStarted","Data":"5a95e48c4cd5753a7a9570170de978d38c586e6db177cb3e8dec78698749cc7d"} Oct 06 13:18:39 crc kubenswrapper[4867]: E1006 13:18:39.820662 4867 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp" podUID="bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3" Oct 06 13:18:39 crc kubenswrapper[4867]: E1006 13:18:39.820752 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc" podUID="937696b7-f234-4e2e-97b3-9ef0f2bf0a90" Oct 06 13:18:39 crc kubenswrapper[4867]: E1006 13:18:39.820787 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg" podUID="6c53454b-e984-4366-8bd1-3c4eb10fb1c8" Oct 06 13:18:39 crc kubenswrapper[4867]: E1006 13:18:39.820834 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f37e29d1f621c23c0d77b09076006d1e8002a77c2ff3d9b8921f893221cb1d09\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6" podUID="efcff7d5-4481-45ea-b693-ebc63e9f1458" Oct 06 13:18:39 crc kubenswrapper[4867]: E1006 13:18:39.820898 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:e4c4ff39c54c0af231fb781759ab50ed86285c74d38bdea43fa75646b762d842\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k" podUID="15792c9d-8f60-4b13-8623-55c9a6a7319b" Oct 06 13:18:39 crc kubenswrapper[4867]: E1006 13:18:39.824003 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8" podUID="9d369f1b-62ae-4b24-8287-fd62b21122ce" Oct 06 13:18:39 crc kubenswrapper[4867]: E1006 13:18:39.826377 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb" podUID="05580142-d01c-470a-afcb-da956c1f6d36" Oct 06 13:18:39 crc kubenswrapper[4867]: I1006 13:18:39.840005 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" podStartSLOduration=3.839974327 podStartE2EDuration="3.839974327s" podCreationTimestamp="2025-10-06 13:18:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:18:39.821875413 +0000 UTC m=+899.279823567" watchObservedRunningTime="2025-10-06 13:18:39.839974327 +0000 UTC m=+899.297922471" Oct 06 13:18:40 crc kubenswrapper[4867]: E1006 13:18:40.835282 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f37e29d1f621c23c0d77b09076006d1e8002a77c2ff3d9b8921f893221cb1d09\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6" podUID="efcff7d5-4481-45ea-b693-ebc63e9f1458" Oct 06 13:18:40 crc kubenswrapper[4867]: E1006 13:18:40.838352 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt" podUID="dbe49bb4-18db-473e-b57c-2047bbbe2405" Oct 06 13:18:47 crc kubenswrapper[4867]: I1006 13:18:47.155797 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-66dbf6f685-4srz5" Oct 06 13:18:50 crc kubenswrapper[4867]: E1006 13:18:50.114269 4867 log.go:32] "ImageFsInfo from image service failed" err="rpc error: code = Unknown desc = get image fs info unable to get usage for /var/lib/containers/storage/overlay-images: get disk usage for path /var/lib/containers/storage/overlay-images: lstat /var/lib/containers/storage/overlay-images/.tmp-images.json1342856062: no such file or directory" Oct 06 13:18:50 crc kubenswrapper[4867]: E1006 13:18:50.114798 4867 kubelet.go:1495] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="missing image stats: nil" Oct 06 13:18:50 crc kubenswrapper[4867]: I1006 13:18:50.933029 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj" event={"ID":"d7b781c6-8500-43b4-884d-e67aadad8518","Type":"ContainerStarted","Data":"953f48165d13059eb79c39de482247709598366c414be649c8bc411fc8a24462"} Oct 06 13:18:50 crc kubenswrapper[4867]: I1006 13:18:50.944497 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m" event={"ID":"901a13c6-49ea-4126-8b2d-7c7901720f05","Type":"ContainerStarted","Data":"4fbbaf66ffeaea793fab21e710342281dcf0475e36f670d70d9b6626ba24672a"} Oct 06 13:18:50 crc kubenswrapper[4867]: I1006 13:18:50.954641 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n" event={"ID":"5d14ff34-79c1-467d-99b0-35202d1650bb","Type":"ContainerStarted","Data":"5e8ea6b1ceb318bd4e6cda4b508a35080ca420b81e4b6e5775e72b786c7a823a"} Oct 06 13:18:50 crc kubenswrapper[4867]: I1006 13:18:50.958797 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6" event={"ID":"831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7","Type":"ContainerStarted","Data":"4fb9d97fa93732869b3032aeca168161ca29ca9a8fe814df9469ebe191ca2f8b"} Oct 06 13:18:50 crc kubenswrapper[4867]: I1006 13:18:50.961115 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb" event={"ID":"21147e7d-1dd6-4a90-ab7a-f923f014a281","Type":"ContainerStarted","Data":"0988fd8df964ba2875888d1cd24a9ee358ac1398814f425701469b504b7a5b31"} Oct 06 13:18:50 crc kubenswrapper[4867]: I1006 13:18:50.966533 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5" event={"ID":"b099322d-539c-4c48-9344-62e1fec437ab","Type":"ContainerStarted","Data":"6b5f202ab78dfadf634a7d77645489064a34b684d5efb3a173d7984f5f894e9d"} Oct 06 13:18:50 crc kubenswrapper[4867]: I1006 13:18:50.971901 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf" event={"ID":"3c3a38a7-d3a0-4c01-aae9-645d5dada80f","Type":"ContainerStarted","Data":"26880a9f29caadec5d1015bc0ddc40f1340a166cdacbdbe88599d83c34b3c0e6"} Oct 06 13:18:50 crc kubenswrapper[4867]: I1006 13:18:50.974659 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn" event={"ID":"c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536","Type":"ContainerStarted","Data":"b03ac9be796bffabcb7054d8f5619b990d84ab40858fa24163bd315255cd5de1"} Oct 06 13:18:50 crc kubenswrapper[4867]: I1006 13:18:50.977174 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw" event={"ID":"3b46e0ea-7a30-45ab-99cc-d36efd3fc75e","Type":"ContainerStarted","Data":"48960a46a84ccebfa1263594f1efe951a20f3c7fd06136c3ddb8c4cec71a98e3"} Oct 06 13:18:50 crc kubenswrapper[4867]: I1006 13:18:50.984636 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc" event={"ID":"3d2faf90-2410-459e-a8a3-668296923f2e","Type":"ContainerStarted","Data":"80afbfaadaec3098a94a9d68142a66817d4358a0649ca7265c48fc44a6063c09"} Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.004452 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn" event={"ID":"c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536","Type":"ContainerStarted","Data":"0c4f3419fb679600edaf115c02db8cd716964727020bceb04a9d4d2a4305b9da"} Oct 06 13:18:52 crc 
kubenswrapper[4867]: I1006 13:18:52.005862 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn" Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.014515 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp" event={"ID":"92cf840d-e92d-4212-8d63-2d623040ca46","Type":"ContainerStarted","Data":"ae06bdd8ed37b606e4fae61c2380414c8962c6b09cc78eccd7ded1e4ac61b506"} Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.024315 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf" event={"ID":"7050df56-39f0-4962-878b-7e9c498d86d4","Type":"ContainerStarted","Data":"cd69f989da14712e1db3547768973342e756e01f3cd210bf501fd6aa507aeba4"} Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.029875 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6" event={"ID":"831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7","Type":"ContainerStarted","Data":"efd4a9744b3995feb547120ffd6dda6c56b665eaebd32c803c3dae1b89e29762"} Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.030000 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6" Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.031991 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj" event={"ID":"d7b781c6-8500-43b4-884d-e67aadad8518","Type":"ContainerStarted","Data":"065866daa058d20f7ac2ef9b74b8d880150491ad91e7eec18d336295a434c7e7"} Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.032426 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj" Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.035434 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf" event={"ID":"311ba4cb-158b-41f4-ada4-4fed1c0f2ede","Type":"ContainerStarted","Data":"c7c727bbc4f3e66e962f0d2409625969e78073cd614840a41a3416b9c7f40ce6"} Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.041418 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn" podStartSLOduration=4.209957064 podStartE2EDuration="17.041397926s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:37.319011338 +0000 UTC m=+896.776959482" lastFinishedPulling="2025-10-06 13:18:50.15045219 +0000 UTC m=+909.608400344" observedRunningTime="2025-10-06 13:18:52.03934766 +0000 UTC m=+911.497295804" watchObservedRunningTime="2025-10-06 13:18:52.041397926 +0000 UTC m=+911.499346070" Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.048741 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m" event={"ID":"901a13c6-49ea-4126-8b2d-7c7901720f05","Type":"ContainerStarted","Data":"93c0dc782d0a2ed7701d0bc58c07243e7821c2224b02be5010f21c70a8140e40"} Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.049735 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m" Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.054430 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5" event={"ID":"b099322d-539c-4c48-9344-62e1fec437ab","Type":"ContainerStarted","Data":"480a015b84f954f226e8fa7df4c5f9a21941a15254ec208d352aec7f3c930e98"} Oct 06 
13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.054651 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5" Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.061789 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n" event={"ID":"5d14ff34-79c1-467d-99b0-35202d1650bb","Type":"ContainerStarted","Data":"645077df9eae24219a92c6a42174a2101c733f5aab6d5ff65c5dd7a961d8d4c4"} Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.061955 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n" Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.065312 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj" podStartSLOduration=4.893477788 podStartE2EDuration="17.065293838s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:37.994649337 +0000 UTC m=+897.452597471" lastFinishedPulling="2025-10-06 13:18:50.166465357 +0000 UTC m=+909.624413521" observedRunningTime="2025-10-06 13:18:52.065279298 +0000 UTC m=+911.523227442" watchObservedRunningTime="2025-10-06 13:18:52.065293838 +0000 UTC m=+911.523241982" Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.066406 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw" event={"ID":"3b46e0ea-7a30-45ab-99cc-d36efd3fc75e","Type":"ContainerStarted","Data":"512f3acca1a1655891e59e25ce0d5333a500e8e9fa60480a20ad6e23e81b8ba7"} Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.066460 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw" Oct 06 13:18:52 crc 
kubenswrapper[4867]: I1006 13:18:52.071386 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l" event={"ID":"95e501d6-fddf-4baa-befd-25c5c5f3303e","Type":"ContainerStarted","Data":"858f7b57c763152353ff2dd97586fa561d6f876366a0803fa1aeb74732319d3e"} Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.094632 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6" podStartSLOduration=4.903242645 podStartE2EDuration="17.094615789s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:37.970735834 +0000 UTC m=+897.428683978" lastFinishedPulling="2025-10-06 13:18:50.162108978 +0000 UTC m=+909.620057122" observedRunningTime="2025-10-06 13:18:52.092700896 +0000 UTC m=+911.550649040" watchObservedRunningTime="2025-10-06 13:18:52.094615789 +0000 UTC m=+911.552563933" Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.138594 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw" podStartSLOduration=4.475226853 podStartE2EDuration="17.138572528s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:37.483672031 +0000 UTC m=+896.941620175" lastFinishedPulling="2025-10-06 13:18:50.147017676 +0000 UTC m=+909.604965850" observedRunningTime="2025-10-06 13:18:52.115270102 +0000 UTC m=+911.573218246" watchObservedRunningTime="2025-10-06 13:18:52.138572528 +0000 UTC m=+911.596520672" Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.143418 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m" podStartSLOduration=4.951620055 podStartE2EDuration="17.14340232s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" 
firstStartedPulling="2025-10-06 13:18:37.969194472 +0000 UTC m=+897.427142616" lastFinishedPulling="2025-10-06 13:18:50.160976737 +0000 UTC m=+909.618924881" observedRunningTime="2025-10-06 13:18:52.137267043 +0000 UTC m=+911.595215197" watchObservedRunningTime="2025-10-06 13:18:52.14340232 +0000 UTC m=+911.601350464"
Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.164284 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n" podStartSLOduration=4.459532626 podStartE2EDuration="17.164230799s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:37.457006324 +0000 UTC m=+896.914954468" lastFinishedPulling="2025-10-06 13:18:50.161704497 +0000 UTC m=+909.619652641" observedRunningTime="2025-10-06 13:18:52.160822995 +0000 UTC m=+911.618771139" watchObservedRunningTime="2025-10-06 13:18:52.164230799 +0000 UTC m=+911.622178943"
Oct 06 13:18:52 crc kubenswrapper[4867]: I1006 13:18:52.198141 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5" podStartSLOduration=5.106064149 podStartE2EDuration="17.198118703s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.026764683 +0000 UTC m=+897.484712827" lastFinishedPulling="2025-10-06 13:18:50.118819227 +0000 UTC m=+909.576767381" observedRunningTime="2025-10-06 13:18:52.193115147 +0000 UTC m=+911.651063291" watchObservedRunningTime="2025-10-06 13:18:52.198118703 +0000 UTC m=+911.656066847"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.081565 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l" event={"ID":"95e501d6-fddf-4baa-befd-25c5c5f3303e","Type":"ContainerStarted","Data":"53b61c9f7d40d1d6a72137d13fda4c048ab1546791c0269bc9a4c65d2d267dc4"}
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.083120 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.085967 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf" event={"ID":"311ba4cb-158b-41f4-ada4-4fed1c0f2ede","Type":"ContainerStarted","Data":"346dfb355da0ad2ec0375a85b85c0fbd72f60700b693707cdbaa7ce31530711e"}
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.086508 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.101750 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb" event={"ID":"21147e7d-1dd6-4a90-ab7a-f923f014a281","Type":"ContainerStarted","Data":"04bd0e176ce6f2cea22e8e35c1b3e476806cc8f34881007a2c721923738aaa75"}
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.102786 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.103874 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l" podStartSLOduration=5.921952723 podStartE2EDuration="18.103850278s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.034626588 +0000 UTC m=+897.492574732" lastFinishedPulling="2025-10-06 13:18:50.216524153 +0000 UTC m=+909.674472287" observedRunningTime="2025-10-06 13:18:53.101081092 +0000 UTC m=+912.559029236" watchObservedRunningTime="2025-10-06 13:18:53.103850278 +0000 UTC m=+912.561798422"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.112502 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf" event={"ID":"3c3a38a7-d3a0-4c01-aae9-645d5dada80f","Type":"ContainerStarted","Data":"12cbf43742d1a4c61485cc800ddf0f97eab4d3fbf86e5614acb0b28d9a4311e2"}
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.112690 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.118490 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp" event={"ID":"92cf840d-e92d-4212-8d63-2d623040ca46","Type":"ContainerStarted","Data":"5f241c6500ec00b2affe1f9cd9ba874fc155ce8b716e18ca40909cbad26afd83"}
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.120531 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.127947 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf" podStartSLOduration=5.925896571 podStartE2EDuration="18.127924816s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.007208429 +0000 UTC m=+897.465156573" lastFinishedPulling="2025-10-06 13:18:50.209236674 +0000 UTC m=+909.667184818" observedRunningTime="2025-10-06 13:18:53.125539621 +0000 UTC m=+912.583487765" watchObservedRunningTime="2025-10-06 13:18:53.127924816 +0000 UTC m=+912.585872960"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.131489 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf" event={"ID":"7050df56-39f0-4962-878b-7e9c498d86d4","Type":"ContainerStarted","Data":"7ac533414454450196c985522c562f26b6d3f9ad42aed3dd960e82f5b9d6a1e9"}
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.132279 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.136708 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc" event={"ID":"3d2faf90-2410-459e-a8a3-668296923f2e","Type":"ContainerStarted","Data":"6f5b65a894fb1a4060908d7ce9d07e180d94275dd77a46fa58c14b6303548352"}
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.136746 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.150053 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf" podStartSLOduration=5.23585802 podStartE2EDuration="18.15003104s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:37.252613886 +0000 UTC m=+896.710562030" lastFinishedPulling="2025-10-06 13:18:50.166786906 +0000 UTC m=+909.624735050" observedRunningTime="2025-10-06 13:18:53.145871947 +0000 UTC m=+912.603820081" watchObservedRunningTime="2025-10-06 13:18:53.15003104 +0000 UTC m=+912.607979174"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.170156 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp" podStartSLOduration=5.946745202 podStartE2EDuration="18.17012748s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:37.940378895 +0000 UTC m=+897.398327029" lastFinishedPulling="2025-10-06 13:18:50.163761153 +0000 UTC m=+909.621709307" observedRunningTime="2025-10-06 13:18:53.163026746 +0000 UTC m=+912.620974910" watchObservedRunningTime="2025-10-06 13:18:53.17012748 +0000 UTC m=+912.628075624"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.184205 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb" podStartSLOduration=6.084742809 podStartE2EDuration="18.184186224s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.062624522 +0000 UTC m=+897.520572666" lastFinishedPulling="2025-10-06 13:18:50.162067927 +0000 UTC m=+909.620016081" observedRunningTime="2025-10-06 13:18:53.179561138 +0000 UTC m=+912.637509282" watchObservedRunningTime="2025-10-06 13:18:53.184186224 +0000 UTC m=+912.642134368"
Oct 06 13:18:53 crc kubenswrapper[4867]: I1006 13:18:53.207550 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc" podStartSLOduration=6.072913907 podStartE2EDuration="18.207526142s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.028195782 +0000 UTC m=+897.486143926" lastFinishedPulling="2025-10-06 13:18:50.162808017 +0000 UTC m=+909.620756161" observedRunningTime="2025-10-06 13:18:53.199802881 +0000 UTC m=+912.657751025" watchObservedRunningTime="2025-10-06 13:18:53.207526142 +0000 UTC m=+912.665474286"
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.166393 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k" event={"ID":"15792c9d-8f60-4b13-8623-55c9a6a7319b","Type":"ContainerStarted","Data":"5a2fcb8ef8f96b9179ec24ea7290fb7f429cc886c099135fad0ffbfb33553d62"}
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.167237 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k"
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.168644 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp" event={"ID":"bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3","Type":"ContainerStarted","Data":"f917bfc1467f1acbce5b1df64b5da797b51ef36fe6c12b0832f3d93b361acd58"}
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.171025 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8" event={"ID":"9d369f1b-62ae-4b24-8287-fd62b21122ce","Type":"ContainerStarted","Data":"adbbc23fb4104d9c9b85d9a1c01912323789936ddf326377c6b9beefa9b10c02"}
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.175535 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-fwbwb"
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.188375 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k" podStartSLOduration=3.6396633400000002 podStartE2EDuration="20.188351434s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.042646006 +0000 UTC m=+897.500594150" lastFinishedPulling="2025-10-06 13:18:54.59133408 +0000 UTC m=+914.049282244" observedRunningTime="2025-10-06 13:18:55.187237684 +0000 UTC m=+914.645185858" watchObservedRunningTime="2025-10-06 13:18:55.188351434 +0000 UTC m=+914.646299678"
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.197958 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf" podStartSLOduration=7.513205528 podStartE2EDuration="20.197934186s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:37.483644791 +0000 UTC m=+896.941592935" lastFinishedPulling="2025-10-06 13:18:50.168373449 +0000 UTC m=+909.626321593" observedRunningTime="2025-10-06 13:18:53.225041551 +0000 UTC m=+912.682989695" watchObservedRunningTime="2025-10-06 13:18:55.197934186 +0000 UTC m=+914.655882330"
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.211516 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8" podStartSLOduration=3.701260373 podStartE2EDuration="20.211497077s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.07538548 +0000 UTC m=+897.533333624" lastFinishedPulling="2025-10-06 13:18:54.585622184 +0000 UTC m=+914.043570328" observedRunningTime="2025-10-06 13:18:55.210209332 +0000 UTC m=+914.668157476" watchObservedRunningTime="2025-10-06 13:18:55.211497077 +0000 UTC m=+914.669445221"
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.254698 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp" podStartSLOduration=2.767917845 podStartE2EDuration="19.254669458s" podCreationTimestamp="2025-10-06 13:18:36 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.108397051 +0000 UTC m=+897.566345195" lastFinishedPulling="2025-10-06 13:18:54.595148654 +0000 UTC m=+914.053096808" observedRunningTime="2025-10-06 13:18:55.249172047 +0000 UTC m=+914.707120201" watchObservedRunningTime="2025-10-06 13:18:55.254669458 +0000 UTC m=+914.712617612"
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.589822 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-4tgfn"
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.621121 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-v72nf"
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.690344 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s4qrw"
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.730729 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-jxt5n"
Oct 06 13:18:55 crc kubenswrapper[4867]: I1006 13:18:55.798907 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-v222m"
Oct 06 13:18:56 crc kubenswrapper[4867]: I1006 13:18:56.010627 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-njdr6"
Oct 06 13:18:56 crc kubenswrapper[4867]: I1006 13:18:56.093958 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-7jwqc"
Oct 06 13:18:56 crc kubenswrapper[4867]: I1006 13:18:56.225973 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-v6b9l"
Oct 06 13:18:56 crc kubenswrapper[4867]: I1006 13:18:56.418366 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-n64zf"
Oct 06 13:18:56 crc kubenswrapper[4867]: I1006 13:18:56.514510 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wrrdj"
Oct 06 13:18:56 crc kubenswrapper[4867]: I1006 13:18:56.542046 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8"
Oct 06 13:18:56 crc kubenswrapper[4867]: I1006 13:18:56.591602 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-55dcdc7cc-z7lp5"
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.219494 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6" event={"ID":"efcff7d5-4481-45ea-b693-ebc63e9f1458","Type":"ContainerStarted","Data":"dc2c31d606068367f568fac3430d108d41d5ccea96a3a8313b87f3b72b661332"}
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.220122 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6"
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.231681 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt" event={"ID":"dbe49bb4-18db-473e-b57c-2047bbbe2405","Type":"ContainerStarted","Data":"41fe1b135c1ed62596ede1e36782d01bc9a43b672b3caf51a2d26c320c91a025"}
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.231732 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc" event={"ID":"937696b7-f234-4e2e-97b3-9ef0f2bf0a90","Type":"ContainerStarted","Data":"0ecd13f5bf62676a5585d787c4c5932b57276d61e2e56c889805fbb0aa9c956c"}
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.231747 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg" event={"ID":"6c53454b-e984-4366-8bd1-3c4eb10fb1c8","Type":"ContainerStarted","Data":"1e64065a3edce0fc9b57fd36d0e97085fe2145a8e733a862a48f94ddd562ac4c"}
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.231758 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb" event={"ID":"05580142-d01c-470a-afcb-da956c1f6d36","Type":"ContainerStarted","Data":"0632e12c52be07707d768819d86b9d406b3966cfdea80cae9b8508bebca39c32"}
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.232674 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb"
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.232801 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc"
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.233150 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg"
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.233214 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt"
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.279533 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6" podStartSLOduration=3.674680717 podStartE2EDuration="24.279497258s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.080464249 +0000 UTC m=+897.538412393" lastFinishedPulling="2025-10-06 13:18:58.68528079 +0000 UTC m=+918.143228934" observedRunningTime="2025-10-06 13:18:59.250063343 +0000 UTC m=+918.708011487" watchObservedRunningTime="2025-10-06 13:18:59.279497258 +0000 UTC m=+918.737445402"
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.280791 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt" podStartSLOduration=3.61081559 podStartE2EDuration="24.280783143s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.079577854 +0000 UTC m=+897.537525998" lastFinishedPulling="2025-10-06 13:18:58.749545407 +0000 UTC m=+918.207493551" observedRunningTime="2025-10-06 13:18:59.278074309 +0000 UTC m=+918.736022453" watchObservedRunningTime="2025-10-06 13:18:59.280783143 +0000 UTC m=+918.738731287"
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.301929 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb" podStartSLOduration=3.687683992 podStartE2EDuration="24.30190459s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.071605247 +0000 UTC m=+897.529553391" lastFinishedPulling="2025-10-06 13:18:58.685825845 +0000 UTC m=+918.143773989" observedRunningTime="2025-10-06 13:18:59.298405715 +0000 UTC m=+918.756353859" watchObservedRunningTime="2025-10-06 13:18:59.30190459 +0000 UTC m=+918.759852734"
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.322488 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg" podStartSLOduration=3.701667234 podStartE2EDuration="24.322468392s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.07024318 +0000 UTC m=+897.528191324" lastFinishedPulling="2025-10-06 13:18:58.691044338 +0000 UTC m=+918.148992482" observedRunningTime="2025-10-06 13:18:59.321030283 +0000 UTC m=+918.778978427" watchObservedRunningTime="2025-10-06 13:18:59.322468392 +0000 UTC m=+918.780416536"
Oct 06 13:18:59 crc kubenswrapper[4867]: I1006 13:18:59.343427 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc" podStartSLOduration=3.737106433 podStartE2EDuration="24.343397715s" podCreationTimestamp="2025-10-06 13:18:35 +0000 UTC" firstStartedPulling="2025-10-06 13:18:38.080856009 +0000 UTC m=+897.538804153" lastFinishedPulling="2025-10-06 13:18:58.687147291 +0000 UTC m=+918.145095435" observedRunningTime="2025-10-06 13:18:59.337719019 +0000 UTC m=+918.795667163" watchObservedRunningTime="2025-10-06 13:18:59.343397715 +0000 UTC m=+918.801345869"
Oct 06 13:19:05 crc kubenswrapper[4867]: I1006 13:19:05.662706 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-84drf"
Oct 06 13:19:05 crc kubenswrapper[4867]: I1006 13:19:05.889873 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-649675d675-qhnpp"
Oct 06 13:19:06 crc kubenswrapper[4867]: I1006 13:19:06.050944 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg"
Oct 06 13:19:06 crc kubenswrapper[4867]: I1006 13:19:06.160967 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-kpx9k"
Oct 06 13:19:06 crc kubenswrapper[4867]: I1006 13:19:06.450611 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-4lsh6"
Oct 06 13:19:06 crc kubenswrapper[4867]: I1006 13:19:06.478357 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-922cb"
Oct 06 13:19:06 crc kubenswrapper[4867]: I1006 13:19:06.498587 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-48lgc"
Oct 06 13:19:06 crc kubenswrapper[4867]: I1006 13:19:06.543752 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-tvpx8"
Oct 06 13:19:06 crc kubenswrapper[4867]: I1006 13:19:06.868140 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.630462 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6697f74bb9-x98l4"]
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.632444 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6697f74bb9-x98l4"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.638779 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.639448 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.639557 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.639732 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bv75c"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.646590 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnp6l\" (UniqueName: \"kubernetes.io/projected/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-kube-api-access-lnp6l\") pod \"dnsmasq-dns-6697f74bb9-x98l4\" (UID: \"f3b4b5c6-a702-4fae-9d27-d09bdc486da8\") " pod="openstack/dnsmasq-dns-6697f74bb9-x98l4"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.646689 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-config\") pod \"dnsmasq-dns-6697f74bb9-x98l4\" (UID: \"f3b4b5c6-a702-4fae-9d27-d09bdc486da8\") " pod="openstack/dnsmasq-dns-6697f74bb9-x98l4"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.648958 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6697f74bb9-x98l4"]
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.716159 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"]
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.717734 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.720695 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.733222 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"]
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.748956 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnp6l\" (UniqueName: \"kubernetes.io/projected/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-kube-api-access-lnp6l\") pod \"dnsmasq-dns-6697f74bb9-x98l4\" (UID: \"f3b4b5c6-a702-4fae-9d27-d09bdc486da8\") " pod="openstack/dnsmasq-dns-6697f74bb9-x98l4"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.749012 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-config\") pod \"dnsmasq-dns-64ff4bc6cc-gmv4z\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.749072 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q2zr\" (UniqueName: \"kubernetes.io/projected/6b621b65-c24b-4a46-96b9-19939ada50e0-kube-api-access-9q2zr\") pod \"dnsmasq-dns-64ff4bc6cc-gmv4z\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.749337 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-config\") pod \"dnsmasq-dns-6697f74bb9-x98l4\" (UID: \"f3b4b5c6-a702-4fae-9d27-d09bdc486da8\") " pod="openstack/dnsmasq-dns-6697f74bb9-x98l4"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.749453 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-dns-svc\") pod \"dnsmasq-dns-64ff4bc6cc-gmv4z\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.751838 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-config\") pod \"dnsmasq-dns-6697f74bb9-x98l4\" (UID: \"f3b4b5c6-a702-4fae-9d27-d09bdc486da8\") " pod="openstack/dnsmasq-dns-6697f74bb9-x98l4"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.778552 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnp6l\" (UniqueName: \"kubernetes.io/projected/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-kube-api-access-lnp6l\") pod \"dnsmasq-dns-6697f74bb9-x98l4\" (UID: \"f3b4b5c6-a702-4fae-9d27-d09bdc486da8\") " pod="openstack/dnsmasq-dns-6697f74bb9-x98l4"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.851572 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-config\") pod \"dnsmasq-dns-64ff4bc6cc-gmv4z\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.851678 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q2zr\" (UniqueName: \"kubernetes.io/projected/6b621b65-c24b-4a46-96b9-19939ada50e0-kube-api-access-9q2zr\") pod \"dnsmasq-dns-64ff4bc6cc-gmv4z\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.851775 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-dns-svc\") pod \"dnsmasq-dns-64ff4bc6cc-gmv4z\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.852614 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-dns-svc\") pod \"dnsmasq-dns-64ff4bc6cc-gmv4z\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.852661 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-config\") pod \"dnsmasq-dns-64ff4bc6cc-gmv4z\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.870799 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q2zr\" (UniqueName: \"kubernetes.io/projected/6b621b65-c24b-4a46-96b9-19939ada50e0-kube-api-access-9q2zr\") pod \"dnsmasq-dns-64ff4bc6cc-gmv4z\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"
Oct 06 13:19:25 crc kubenswrapper[4867]: I1006 13:19:25.966942 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6697f74bb9-x98l4"
Oct 06 13:19:26 crc kubenswrapper[4867]: I1006 13:19:26.043156 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"
Oct 06 13:19:26 crc kubenswrapper[4867]: I1006 13:19:26.370213 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6697f74bb9-x98l4"]
Oct 06 13:19:26 crc kubenswrapper[4867]: I1006 13:19:26.467437 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"]
Oct 06 13:19:26 crc kubenswrapper[4867]: W1006 13:19:26.471351 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b621b65_c24b_4a46_96b9_19939ada50e0.slice/crio-fa3d24486d31d14ed51c144bd48bf7c455a13ddb7e3f1121a21e90003d87395c WatchSource:0}: Error finding container fa3d24486d31d14ed51c144bd48bf7c455a13ddb7e3f1121a21e90003d87395c: Status 404 returned error can't find the container with id fa3d24486d31d14ed51c144bd48bf7c455a13ddb7e3f1121a21e90003d87395c
Oct 06 13:19:26 crc kubenswrapper[4867]: I1006 13:19:26.488592 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6697f74bb9-x98l4" event={"ID":"f3b4b5c6-a702-4fae-9d27-d09bdc486da8","Type":"ContainerStarted","Data":"623beae4fcda638170413ee29db0a3a8cff5076873242cf9b39ab507a413deb4"}
Oct 06 13:19:26 crc kubenswrapper[4867]: I1006 13:19:26.490435 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z" event={"ID":"6b621b65-c24b-4a46-96b9-19939ada50e0","Type":"ContainerStarted","Data":"fa3d24486d31d14ed51c144bd48bf7c455a13ddb7e3f1121a21e90003d87395c"}
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.685141 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"]
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.699133 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86d858c69c-6g9hk"]
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.700825 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d858c69c-6g9hk"
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.711857 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d858c69c-6g9hk"]
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.827579 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx8ml\" (UniqueName: \"kubernetes.io/projected/dd3db137-4c43-4e44-abb9-707d0a322393-kube-api-access-jx8ml\") pod \"dnsmasq-dns-86d858c69c-6g9hk\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " pod="openstack/dnsmasq-dns-86d858c69c-6g9hk"
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.827689 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-config\") pod \"dnsmasq-dns-86d858c69c-6g9hk\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " pod="openstack/dnsmasq-dns-86d858c69c-6g9hk"
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.827733 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-dns-svc\") pod \"dnsmasq-dns-86d858c69c-6g9hk\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " pod="openstack/dnsmasq-dns-86d858c69c-6g9hk"
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.929462 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-config\") pod \"dnsmasq-dns-86d858c69c-6g9hk\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " pod="openstack/dnsmasq-dns-86d858c69c-6g9hk"
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.929533 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-dns-svc\") pod \"dnsmasq-dns-86d858c69c-6g9hk\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " pod="openstack/dnsmasq-dns-86d858c69c-6g9hk"
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.929561 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx8ml\" (UniqueName: \"kubernetes.io/projected/dd3db137-4c43-4e44-abb9-707d0a322393-kube-api-access-jx8ml\") pod \"dnsmasq-dns-86d858c69c-6g9hk\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " pod="openstack/dnsmasq-dns-86d858c69c-6g9hk"
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.930550 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-config\") pod \"dnsmasq-dns-86d858c69c-6g9hk\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " pod="openstack/dnsmasq-dns-86d858c69c-6g9hk"
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.930664 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-dns-svc\") pod \"dnsmasq-dns-86d858c69c-6g9hk\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " pod="openstack/dnsmasq-dns-86d858c69c-6g9hk"
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.962131 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6697f74bb9-x98l4"]
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.968237 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx8ml\" (UniqueName: \"kubernetes.io/projected/dd3db137-4c43-4e44-abb9-707d0a322393-kube-api-access-jx8ml\") pod \"dnsmasq-dns-86d858c69c-6g9hk\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " pod="openstack/dnsmasq-dns-86d858c69c-6g9hk"
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.996581 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-786b66f8cc-8vwnt"]
Oct 06 13:19:29 crc kubenswrapper[4867]: I1006 13:19:29.997881 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt"
Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.006442 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-786b66f8cc-8vwnt"]
Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.025595 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d858c69c-6g9hk"
Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.131481 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-config\") pod \"dnsmasq-dns-786b66f8cc-8vwnt\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt"
Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.131547 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-dns-svc\") pod \"dnsmasq-dns-786b66f8cc-8vwnt\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt"
Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.131598 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srd9g\" (UniqueName: \"kubernetes.io/projected/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-kube-api-access-srd9g\") pod \"dnsmasq-dns-786b66f8cc-8vwnt\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt"
Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.233421 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srd9g\" (UniqueName: \"kubernetes.io/projected/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-kube-api-access-srd9g\") pod \"dnsmasq-dns-786b66f8cc-8vwnt\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt"
Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.233562 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-config\") pod \"dnsmasq-dns-786b66f8cc-8vwnt\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt"
Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.233603 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-dns-svc\") pod \"dnsmasq-dns-786b66f8cc-8vwnt\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt"
Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.234742 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-dns-svc\") pod \"dnsmasq-dns-786b66f8cc-8vwnt\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt"
Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.234842 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-config\") pod \"dnsmasq-dns-786b66f8cc-8vwnt\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.252031 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srd9g\" (UniqueName: \"kubernetes.io/projected/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-kube-api-access-srd9g\") pod \"dnsmasq-dns-786b66f8cc-8vwnt\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.318968 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.400678 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-786b66f8cc-8vwnt"] Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.416218 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c44d66bd9-sfd9p"] Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.419924 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.430229 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c44d66bd9-sfd9p"] Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.539479 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq9l4\" (UniqueName: \"kubernetes.io/projected/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-kube-api-access-gq9l4\") pod \"dnsmasq-dns-7c44d66bd9-sfd9p\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.539543 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-config\") pod \"dnsmasq-dns-7c44d66bd9-sfd9p\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.539597 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-dns-svc\") pod \"dnsmasq-dns-7c44d66bd9-sfd9p\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.641409 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq9l4\" (UniqueName: \"kubernetes.io/projected/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-kube-api-access-gq9l4\") pod \"dnsmasq-dns-7c44d66bd9-sfd9p\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.641470 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-config\") pod \"dnsmasq-dns-7c44d66bd9-sfd9p\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.641529 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-dns-svc\") pod \"dnsmasq-dns-7c44d66bd9-sfd9p\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.642541 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-config\") pod \"dnsmasq-dns-7c44d66bd9-sfd9p\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.642603 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-dns-svc\") pod \"dnsmasq-dns-7c44d66bd9-sfd9p\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.657972 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq9l4\" (UniqueName: \"kubernetes.io/projected/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-kube-api-access-gq9l4\") pod \"dnsmasq-dns-7c44d66bd9-sfd9p\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.768909 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.867071 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.868894 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.871590 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.871706 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.871876 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.871935 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.873114 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bjnpx" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.873168 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.873362 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.896322 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.961091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.961170 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.961197 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.961221 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.961289 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.961311 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.961331 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.961345 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.961376 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-config-data\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.961493 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9svjq\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-kube-api-access-9svjq\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:30 crc kubenswrapper[4867]: I1006 13:19:30.961720 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063288 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063354 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063380 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063430 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063456 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " 
pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063478 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063492 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063527 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-config-data\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063570 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9svjq\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-kube-api-access-9svjq\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063589 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063616 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.063902 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.064086 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.064199 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.064846 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.064908 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.065327 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.068837 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.069795 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.076751 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.080956 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.091092 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.092041 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9svjq\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-kube-api-access-9svjq\") pod \"rabbitmq-server-0\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.208955 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.218636 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.219975 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.232069 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.232275 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.232423 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.232607 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.232878 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.233092 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.237063 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-m7h52" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.238087 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.369399 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.369852 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.369892 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f249bfb-ab86-491d-9d1c-b3930fdea27d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.370049 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.370102 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.370178 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.370328 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/9f249bfb-ab86-491d-9d1c-b3930fdea27d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.370373 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.370424 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.370479 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.370984 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkc7h\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-kube-api-access-zkc7h\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.474710 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.474760 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.474790 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.474812 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f249bfb-ab86-491d-9d1c-b3930fdea27d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.474835 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.474867 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.474896 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.474920 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkc7h\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-kube-api-access-zkc7h\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.474965 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.474986 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.475005 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f249bfb-ab86-491d-9d1c-b3930fdea27d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.475795 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.478802 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.478820 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.479091 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.479390 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.481171 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.482168 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.482522 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f249bfb-ab86-491d-9d1c-b3930fdea27d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.484928 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.487699 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f249bfb-ab86-491d-9d1c-b3930fdea27d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.494852 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkc7h\" (UniqueName: 
\"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-kube-api-access-zkc7h\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.505599 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.554503 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.577622 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.586155 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.590643 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.590811 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-wx85x" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.590922 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.591125 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.591196 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.591690 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.591889 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.591901 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.680697 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4beec03b-3d57-4c36-a149-153bb022bd7a-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.681312 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4beec03b-3d57-4c36-a149-153bb022bd7a-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.681582 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4beec03b-3d57-4c36-a149-153bb022bd7a-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.681733 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.681921 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.682112 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc 
kubenswrapper[4867]: I1006 13:19:31.682235 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4beec03b-3d57-4c36-a149-153bb022bd7a-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.682550 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.682752 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4beec03b-3d57-4c36-a149-153bb022bd7a-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.682876 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj49x\" (UniqueName: \"kubernetes.io/projected/4beec03b-3d57-4c36-a149-153bb022bd7a-kube-api-access-jj49x\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.682999 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: 
\"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.785471 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.785536 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4beec03b-3d57-4c36-a149-153bb022bd7a-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.785568 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.785615 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4beec03b-3d57-4c36-a149-153bb022bd7a-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.785647 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj49x\" (UniqueName: \"kubernetes.io/projected/4beec03b-3d57-4c36-a149-153bb022bd7a-kube-api-access-jj49x\") pod \"rabbitmq-notifications-server-0\" (UID: 
\"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.785665 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.785705 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4beec03b-3d57-4c36-a149-153bb022bd7a-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.785728 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4beec03b-3d57-4c36-a149-153bb022bd7a-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.785764 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4beec03b-3d57-4c36-a149-153bb022bd7a-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.785795 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: 
\"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.785825 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.786153 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.786692 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4beec03b-3d57-4c36-a149-153bb022bd7a-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.787123 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.787381 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4beec03b-3d57-4c36-a149-153bb022bd7a-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " 
pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.787823 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.788290 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4beec03b-3d57-4c36-a149-153bb022bd7a-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.789590 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.791097 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4beec03b-3d57-4c36-a149-153bb022bd7a-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.799436 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4beec03b-3d57-4c36-a149-153bb022bd7a-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 
13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.801021 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4beec03b-3d57-4c36-a149-153bb022bd7a-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.802269 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj49x\" (UniqueName: \"kubernetes.io/projected/4beec03b-3d57-4c36-a149-153bb022bd7a-kube-api-access-jj49x\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.815278 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"4beec03b-3d57-4c36-a149-153bb022bd7a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:31 crc kubenswrapper[4867]: I1006 13:19:31.917433 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.081364 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.085083 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.087556 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-9ztpm" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.088529 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.088944 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.089767 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.090585 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.092598 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.097390 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.238721 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ec109351-f578-4141-8193-44f6433880b3-secrets\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.238763 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec109351-f578-4141-8193-44f6433880b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc 
kubenswrapper[4867]: I1006 13:19:34.238794 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec109351-f578-4141-8193-44f6433880b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.238842 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.238864 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec109351-f578-4141-8193-44f6433880b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.239056 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec109351-f578-4141-8193-44f6433880b3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.239153 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec109351-f578-4141-8193-44f6433880b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.239222 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97nxx\" (UniqueName: \"kubernetes.io/projected/ec109351-f578-4141-8193-44f6433880b3-kube-api-access-97nxx\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.239313 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec109351-f578-4141-8193-44f6433880b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.340692 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec109351-f578-4141-8193-44f6433880b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.340841 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec109351-f578-4141-8193-44f6433880b3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.340912 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec109351-f578-4141-8193-44f6433880b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.340955 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-97nxx\" (UniqueName: \"kubernetes.io/projected/ec109351-f578-4141-8193-44f6433880b3-kube-api-access-97nxx\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.340996 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec109351-f578-4141-8193-44f6433880b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.341068 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec109351-f578-4141-8193-44f6433880b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.341093 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ec109351-f578-4141-8193-44f6433880b3-secrets\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.341118 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec109351-f578-4141-8193-44f6433880b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.341173 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: 
\"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.341909 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec109351-f578-4141-8193-44f6433880b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.342160 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.343630 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec109351-f578-4141-8193-44f6433880b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.344164 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec109351-f578-4141-8193-44f6433880b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.344217 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec109351-f578-4141-8193-44f6433880b3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.347419 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec109351-f578-4141-8193-44f6433880b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.367764 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ec109351-f578-4141-8193-44f6433880b3-secrets\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.367893 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97nxx\" (UniqueName: \"kubernetes.io/projected/ec109351-f578-4141-8193-44f6433880b3-kube-api-access-97nxx\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.368000 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.372220 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec109351-f578-4141-8193-44f6433880b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ec109351-f578-4141-8193-44f6433880b3\") " pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.423943 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.456821 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.461149 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.463314 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.464895 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-77x2m" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.464932 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.466595 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.468185 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.543989 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/acd2b7ce-fe29-4b71-b730-7b1212f4416d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.544172 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd2b7ce-fe29-4b71-b730-7b1212f4416d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.544293 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd2b7ce-fe29-4b71-b730-7b1212f4416d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.544328 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/acd2b7ce-fe29-4b71-b730-7b1212f4416d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.544412 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acd2b7ce-fe29-4b71-b730-7b1212f4416d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.544485 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/acd2b7ce-fe29-4b71-b730-7b1212f4416d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.544516 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/acd2b7ce-fe29-4b71-b730-7b1212f4416d-secrets\") pod 
\"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.544583 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlncd\" (UniqueName: \"kubernetes.io/projected/acd2b7ce-fe29-4b71-b730-7b1212f4416d-kube-api-access-dlncd\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.544651 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.647321 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/acd2b7ce-fe29-4b71-b730-7b1212f4416d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.647780 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd2b7ce-fe29-4b71-b730-7b1212f4416d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.647815 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd2b7ce-fe29-4b71-b730-7b1212f4416d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.647838 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/acd2b7ce-fe29-4b71-b730-7b1212f4416d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.647871 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acd2b7ce-fe29-4b71-b730-7b1212f4416d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.647895 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/acd2b7ce-fe29-4b71-b730-7b1212f4416d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.647911 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/acd2b7ce-fe29-4b71-b730-7b1212f4416d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.647928 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlncd\" (UniqueName: \"kubernetes.io/projected/acd2b7ce-fe29-4b71-b730-7b1212f4416d-kube-api-access-dlncd\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " 
pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.647950 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.648095 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.648623 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/acd2b7ce-fe29-4b71-b730-7b1212f4416d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.648989 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/acd2b7ce-fe29-4b71-b730-7b1212f4416d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.649107 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/acd2b7ce-fe29-4b71-b730-7b1212f4416d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.650061 
4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acd2b7ce-fe29-4b71-b730-7b1212f4416d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.651485 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd2b7ce-fe29-4b71-b730-7b1212f4416d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.653391 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd2b7ce-fe29-4b71-b730-7b1212f4416d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.660748 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/acd2b7ce-fe29-4b71-b730-7b1212f4416d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.667991 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlncd\" (UniqueName: \"kubernetes.io/projected/acd2b7ce-fe29-4b71-b730-7b1212f4416d-kube-api-access-dlncd\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.673311 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"acd2b7ce-fe29-4b71-b730-7b1212f4416d\") " pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.782459 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.783871 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.786340 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.786446 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-zdkft" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.786596 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.786673 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.800218 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.850977 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-combined-ca-bundle\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.851034 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-config-data\") pod \"memcached-0\" 
(UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.851100 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njvr7\" (UniqueName: \"kubernetes.io/projected/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-kube-api-access-njvr7\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.851120 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-kolla-config\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.851179 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-memcached-tls-certs\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.953195 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-memcached-tls-certs\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.953295 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-combined-ca-bundle\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 
13:19:34.953346 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-config-data\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.953425 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njvr7\" (UniqueName: \"kubernetes.io/projected/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-kube-api-access-njvr7\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.953450 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-kolla-config\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.954731 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-config-data\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.955685 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-kolla-config\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.957666 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.957882 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-memcached-tls-certs\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:34 crc kubenswrapper[4867]: I1006 13:19:34.977126 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njvr7\" (UniqueName: \"kubernetes.io/projected/40e8af9c-90c3-4d15-b8c8-c7b35447bf17-kube-api-access-njvr7\") pod \"memcached-0\" (UID: \"40e8af9c-90c3-4d15-b8c8-c7b35447bf17\") " pod="openstack/memcached-0" Oct 06 13:19:35 crc kubenswrapper[4867]: I1006 13:19:35.102548 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 13:19:36 crc kubenswrapper[4867]: I1006 13:19:36.628465 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 13:19:36 crc kubenswrapper[4867]: I1006 13:19:36.630128 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 13:19:36 crc kubenswrapper[4867]: I1006 13:19:36.633515 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5xn6l" Oct 06 13:19:36 crc kubenswrapper[4867]: I1006 13:19:36.652000 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 13:19:36 crc kubenswrapper[4867]: I1006 13:19:36.686846 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4j7j\" (UniqueName: \"kubernetes.io/projected/469e79f5-1d34-4151-ae0b-81301742c10c-kube-api-access-t4j7j\") pod \"kube-state-metrics-0\" (UID: \"469e79f5-1d34-4151-ae0b-81301742c10c\") " pod="openstack/kube-state-metrics-0" Oct 06 13:19:36 crc kubenswrapper[4867]: I1006 13:19:36.789397 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4j7j\" (UniqueName: \"kubernetes.io/projected/469e79f5-1d34-4151-ae0b-81301742c10c-kube-api-access-t4j7j\") pod \"kube-state-metrics-0\" (UID: \"469e79f5-1d34-4151-ae0b-81301742c10c\") " pod="openstack/kube-state-metrics-0" Oct 06 13:19:36 crc kubenswrapper[4867]: I1006 13:19:36.824431 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4j7j\" (UniqueName: \"kubernetes.io/projected/469e79f5-1d34-4151-ae0b-81301742c10c-kube-api-access-t4j7j\") pod \"kube-state-metrics-0\" (UID: \"469e79f5-1d34-4151-ae0b-81301742c10c\") " pod="openstack/kube-state-metrics-0" Oct 06 13:19:36 crc kubenswrapper[4867]: I1006 13:19:36.953460 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 13:19:37 crc kubenswrapper[4867]: I1006 13:19:37.958295 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:19:37 crc kubenswrapper[4867]: I1006 13:19:37.961588 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:37 crc kubenswrapper[4867]: I1006 13:19:37.972857 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:19:37 crc kubenswrapper[4867]: I1006 13:19:37.973980 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 06 13:19:37 crc kubenswrapper[4867]: I1006 13:19:37.974035 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-hkmxs" Oct 06 13:19:37 crc kubenswrapper[4867]: I1006 13:19:37.974433 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 06 13:19:37 crc kubenswrapper[4867]: I1006 13:19:37.974532 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 06 13:19:37 crc kubenswrapper[4867]: I1006 13:19:37.974914 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 06 13:19:37 crc kubenswrapper[4867]: I1006 13:19:37.975701 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.016074 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod 
\"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.016156 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.016195 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69cpj\" (UniqueName: \"kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-kube-api-access-69cpj\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.016221 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.016278 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.016547 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.016749 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.016830 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.118305 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.118426 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.118466 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.118489 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69cpj\" (UniqueName: \"kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-kube-api-access-69cpj\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.118514 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.118543 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.118589 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.118645 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.120627 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.125128 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.125745 4867 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.125774 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8b1bb68adf8576a50f9d1afe1558762f141c90adcfe42ae323643ac07b58f5a8/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.126347 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.126612 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.127025 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.136661 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-tls-assets\") 
pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.138427 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69cpj\" (UniqueName: \"kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-kube-api-access-69cpj\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.170763 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"prometheus-metric-storage-0\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:38 crc kubenswrapper[4867]: I1006 13:19:38.291302 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.450834 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-tg8j4"] Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.453090 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.455846 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.455846 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-26x6l" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.456010 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.458632 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-k22cm"] Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.460822 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.463974 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tg8j4"] Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.476387 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k22cm"] Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589386 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68750dd5-11c8-4fee-853c-09b68df5aff8-combined-ca-bundle\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589446 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-var-lib\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " 
pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589492 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnn47\" (UniqueName: \"kubernetes.io/projected/7478d336-9573-432c-8d73-f7396d652085-kube-api-access-vnn47\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589673 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7478d336-9573-432c-8d73-f7396d652085-scripts\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589718 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68750dd5-11c8-4fee-853c-09b68df5aff8-scripts\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589736 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfrv6\" (UniqueName: \"kubernetes.io/projected/68750dd5-11c8-4fee-853c-09b68df5aff8-kube-api-access-mfrv6\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589767 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68750dd5-11c8-4fee-853c-09b68df5aff8-var-log-ovn\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " 
pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589791 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-etc-ovs\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589814 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68750dd5-11c8-4fee-853c-09b68df5aff8-var-run-ovn\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589834 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-var-run\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589855 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68750dd5-11c8-4fee-853c-09b68df5aff8-var-run\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.589871 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-var-log\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: 
I1006 13:19:40.589902 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/68750dd5-11c8-4fee-853c-09b68df5aff8-ovn-controller-tls-certs\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692083 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68750dd5-11c8-4fee-853c-09b68df5aff8-scripts\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692158 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfrv6\" (UniqueName: \"kubernetes.io/projected/68750dd5-11c8-4fee-853c-09b68df5aff8-kube-api-access-mfrv6\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692192 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68750dd5-11c8-4fee-853c-09b68df5aff8-var-log-ovn\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692218 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-etc-ovs\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692240 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68750dd5-11c8-4fee-853c-09b68df5aff8-var-run-ovn\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692285 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-var-run\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692316 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-var-log\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692331 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68750dd5-11c8-4fee-853c-09b68df5aff8-var-run\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692363 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/68750dd5-11c8-4fee-853c-09b68df5aff8-ovn-controller-tls-certs\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692388 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68750dd5-11c8-4fee-853c-09b68df5aff8-combined-ca-bundle\") pod \"ovn-controller-tg8j4\" 
(UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692408 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-var-lib\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692435 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnn47\" (UniqueName: \"kubernetes.io/projected/7478d336-9573-432c-8d73-f7396d652085-kube-api-access-vnn47\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.692464 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7478d336-9573-432c-8d73-f7396d652085-scripts\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.694030 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-var-log\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.694170 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-etc-ovs\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.694515 
4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68750dd5-11c8-4fee-853c-09b68df5aff8-var-log-ovn\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.694524 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68750dd5-11c8-4fee-853c-09b68df5aff8-var-run-ovn\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.694619 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-var-run\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.694660 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68750dd5-11c8-4fee-853c-09b68df5aff8-var-run\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.695041 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7478d336-9573-432c-8d73-f7396d652085-var-lib\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.696176 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68750dd5-11c8-4fee-853c-09b68df5aff8-scripts\") pod \"ovn-controller-tg8j4\" (UID: 
\"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.697627 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7478d336-9573-432c-8d73-f7396d652085-scripts\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.702966 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/68750dd5-11c8-4fee-853c-09b68df5aff8-ovn-controller-tls-certs\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.702994 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68750dd5-11c8-4fee-853c-09b68df5aff8-combined-ca-bundle\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.717412 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfrv6\" (UniqueName: \"kubernetes.io/projected/68750dd5-11c8-4fee-853c-09b68df5aff8-kube-api-access-mfrv6\") pod \"ovn-controller-tg8j4\" (UID: \"68750dd5-11c8-4fee-853c-09b68df5aff8\") " pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.720938 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnn47\" (UniqueName: \"kubernetes.io/projected/7478d336-9573-432c-8d73-f7396d652085-kube-api-access-vnn47\") pod \"ovn-controller-ovs-k22cm\" (UID: \"7478d336-9573-432c-8d73-f7396d652085\") " pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:40 crc kubenswrapper[4867]: 
I1006 13:19:40.781972 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tg8j4" Oct 06 13:19:40 crc kubenswrapper[4867]: I1006 13:19:40.811124 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.306046 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.307858 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.312038 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.312668 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.312848 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-vfng4" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.313150 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.314291 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.333628 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.434394 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8d27ae1-8b6d-4a9d-b302-a354673be3be-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.434748 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8d27ae1-8b6d-4a9d-b302-a354673be3be-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.434776 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.434797 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d27ae1-8b6d-4a9d-b302-a354673be3be-config\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.434829 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d27ae1-8b6d-4a9d-b302-a354673be3be-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.434901 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss9kl\" (UniqueName: \"kubernetes.io/projected/b8d27ae1-8b6d-4a9d-b302-a354673be3be-kube-api-access-ss9kl\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 
13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.434952 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8d27ae1-8b6d-4a9d-b302-a354673be3be-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.434989 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8d27ae1-8b6d-4a9d-b302-a354673be3be-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.537083 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8d27ae1-8b6d-4a9d-b302-a354673be3be-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.537312 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.537386 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d27ae1-8b6d-4a9d-b302-a354673be3be-config\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.537422 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b8d27ae1-8b6d-4a9d-b302-a354673be3be-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.537455 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss9kl\" (UniqueName: \"kubernetes.io/projected/b8d27ae1-8b6d-4a9d-b302-a354673be3be-kube-api-access-ss9kl\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.537493 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8d27ae1-8b6d-4a9d-b302-a354673be3be-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.537517 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8d27ae1-8b6d-4a9d-b302-a354673be3be-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.537567 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8d27ae1-8b6d-4a9d-b302-a354673be3be-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.537774 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") device mount path 
\"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.538665 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d27ae1-8b6d-4a9d-b302-a354673be3be-config\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.539258 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8d27ae1-8b6d-4a9d-b302-a354673be3be-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.539374 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8d27ae1-8b6d-4a9d-b302-a354673be3be-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.544104 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d27ae1-8b6d-4a9d-b302-a354673be3be-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.545854 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8d27ae1-8b6d-4a9d-b302-a354673be3be-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.554452 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b8d27ae1-8b6d-4a9d-b302-a354673be3be-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.557887 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss9kl\" (UniqueName: \"kubernetes.io/projected/b8d27ae1-8b6d-4a9d-b302-a354673be3be-kube-api-access-ss9kl\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.574866 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b8d27ae1-8b6d-4a9d-b302-a354673be3be\") " pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:42 crc kubenswrapper[4867]: I1006 13:19:42.631523 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 13:19:43 crc kubenswrapper[4867]: E1006 13:19:43.333896 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.151:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 06 13:19:43 crc kubenswrapper[4867]: E1006 13:19:43.334981 4867 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.151:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 06 13:19:43 crc kubenswrapper[4867]: E1006 13:19:43.335281 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.151:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9q2zr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64ff4bc6cc-gmv4z_openstack(6b621b65-c24b-4a46-96b9-19939ada50e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 13:19:43 crc kubenswrapper[4867]: E1006 13:19:43.336440 4867 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z" podUID="6b621b65-c24b-4a46-96b9-19939ada50e0" Oct 06 13:19:43 crc kubenswrapper[4867]: E1006 13:19:43.474804 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.151:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 06 13:19:43 crc kubenswrapper[4867]: E1006 13:19:43.474915 4867 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.151:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 06 13:19:43 crc kubenswrapper[4867]: E1006 13:19:43.475107 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.151:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lnp6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6697f74bb9-x98l4_openstack(f3b4b5c6-a702-4fae-9d27-d09bdc486da8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 13:19:43 crc kubenswrapper[4867]: E1006 13:19:43.476193 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-6697f74bb9-x98l4" podUID="f3b4b5c6-a702-4fae-9d27-d09bdc486da8" Oct 06 13:19:43 crc kubenswrapper[4867]: I1006 13:19:43.881865 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.380920 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6697f74bb9-x98l4" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.389899 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.486016 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-config\") pod \"6b621b65-c24b-4a46-96b9-19939ada50e0\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.486105 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-dns-svc\") pod \"6b621b65-c24b-4a46-96b9-19939ada50e0\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.486169 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-config\") pod \"f3b4b5c6-a702-4fae-9d27-d09bdc486da8\" (UID: \"f3b4b5c6-a702-4fae-9d27-d09bdc486da8\") " Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.486280 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnp6l\" (UniqueName: \"kubernetes.io/projected/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-kube-api-access-lnp6l\") pod \"f3b4b5c6-a702-4fae-9d27-d09bdc486da8\" (UID: 
\"f3b4b5c6-a702-4fae-9d27-d09bdc486da8\") " Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.486415 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q2zr\" (UniqueName: \"kubernetes.io/projected/6b621b65-c24b-4a46-96b9-19939ada50e0-kube-api-access-9q2zr\") pod \"6b621b65-c24b-4a46-96b9-19939ada50e0\" (UID: \"6b621b65-c24b-4a46-96b9-19939ada50e0\") " Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.488369 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b621b65-c24b-4a46-96b9-19939ada50e0" (UID: "6b621b65-c24b-4a46-96b9-19939ada50e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.488565 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-config" (OuterVolumeSpecName: "config") pod "f3b4b5c6-a702-4fae-9d27-d09bdc486da8" (UID: "f3b4b5c6-a702-4fae-9d27-d09bdc486da8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.488696 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-config" (OuterVolumeSpecName: "config") pod "6b621b65-c24b-4a46-96b9-19939ada50e0" (UID: "6b621b65-c24b-4a46-96b9-19939ada50e0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.494105 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-kube-api-access-lnp6l" (OuterVolumeSpecName: "kube-api-access-lnp6l") pod "f3b4b5c6-a702-4fae-9d27-d09bdc486da8" (UID: "f3b4b5c6-a702-4fae-9d27-d09bdc486da8"). InnerVolumeSpecName "kube-api-access-lnp6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.494812 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b621b65-c24b-4a46-96b9-19939ada50e0-kube-api-access-9q2zr" (OuterVolumeSpecName: "kube-api-access-9q2zr") pod "6b621b65-c24b-4a46-96b9-19939ada50e0" (UID: "6b621b65-c24b-4a46-96b9-19939ada50e0"). InnerVolumeSpecName "kube-api-access-9q2zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.589085 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q2zr\" (UniqueName: \"kubernetes.io/projected/6b621b65-c24b-4a46-96b9-19939ada50e0-kube-api-access-9q2zr\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.589125 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.589136 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b621b65-c24b-4a46-96b9-19939ada50e0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.589146 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-config\") on node 
\"crc\" DevicePath \"\"" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.589154 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnp6l\" (UniqueName: \"kubernetes.io/projected/f3b4b5c6-a702-4fae-9d27-d09bdc486da8-kube-api-access-lnp6l\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.727486 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f249bfb-ab86-491d-9d1c-b3930fdea27d","Type":"ContainerStarted","Data":"95afb45fcc88dfad0aa8c21882c027abe8b7fa9d313b2829d338b3ecdf624847"} Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.728809 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.728849 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64ff4bc6cc-gmv4z" event={"ID":"6b621b65-c24b-4a46-96b9-19939ada50e0","Type":"ContainerDied","Data":"fa3d24486d31d14ed51c144bd48bf7c455a13ddb7e3f1121a21e90003d87395c"} Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.729810 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6697f74bb9-x98l4" event={"ID":"f3b4b5c6-a702-4fae-9d27-d09bdc486da8","Type":"ContainerDied","Data":"623beae4fcda638170413ee29db0a3a8cff5076873242cf9b39ab507a413deb4"} Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.729838 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6697f74bb9-x98l4" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.803504 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6697f74bb9-x98l4"] Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.809921 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6697f74bb9-x98l4"] Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.831300 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"] Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.840531 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64ff4bc6cc-gmv4z"] Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.855349 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.857670 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.861213 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.861799 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.862238 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8h2rc" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.863364 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.876451 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.894746 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18420b8b-345a-41e6-b753-6766143362a3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.894796 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwkwb\" (UniqueName: \"kubernetes.io/projected/18420b8b-345a-41e6-b753-6766143362a3-kube-api-access-dwkwb\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.894854 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18420b8b-345a-41e6-b753-6766143362a3-config\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.894876 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.894896 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18420b8b-345a-41e6-b753-6766143362a3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.894918 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/18420b8b-345a-41e6-b753-6766143362a3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.894957 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/18420b8b-345a-41e6-b753-6766143362a3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.895063 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18420b8b-345a-41e6-b753-6766143362a3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.966584 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.994547 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 13:19:44 crc kubenswrapper[4867]: I1006 13:19:44.998479 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18420b8b-345a-41e6-b753-6766143362a3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:44.998675 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18420b8b-345a-41e6-b753-6766143362a3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 
06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:44.999304 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwkwb\" (UniqueName: \"kubernetes.io/projected/18420b8b-345a-41e6-b753-6766143362a3-kube-api-access-dwkwb\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:44.999581 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18420b8b-345a-41e6-b753-6766143362a3-config\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:44.999651 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:44.999714 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18420b8b-345a-41e6-b753-6766143362a3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:44.999759 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18420b8b-345a-41e6-b753-6766143362a3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:44.999903 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/18420b8b-345a-41e6-b753-6766143362a3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.000495 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.015093 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18420b8b-345a-41e6-b753-6766143362a3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.019662 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18420b8b-345a-41e6-b753-6766143362a3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.020489 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18420b8b-345a-41e6-b753-6766143362a3-config\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.023720 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18420b8b-345a-41e6-b753-6766143362a3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 
13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.030139 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/18420b8b-345a-41e6-b753-6766143362a3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.032100 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18420b8b-345a-41e6-b753-6766143362a3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.037676 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-786b66f8cc-8vwnt"] Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.059358 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.086394 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.089948 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.098430 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwkwb\" (UniqueName: \"kubernetes.io/projected/18420b8b-345a-41e6-b753-6766143362a3-kube-api-access-dwkwb\") pod \"ovsdbserver-sb-0\" (UID: \"18420b8b-345a-41e6-b753-6766143362a3\") " pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: W1006 13:19:45.112838 4867 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacd7f8b9_810f_4e76_b971_c466bf7d4a5b.slice/crio-e5088680bb2cbc26e4df1b53ba9ec79152a37c9f2acbb99888ad7bdc7995ec7e WatchSource:0}: Error finding container e5088680bb2cbc26e4df1b53ba9ec79152a37c9f2acbb99888ad7bdc7995ec7e: Status 404 returned error can't find the container with id e5088680bb2cbc26e4df1b53ba9ec79152a37c9f2acbb99888ad7bdc7995ec7e Oct 06 13:19:45 crc kubenswrapper[4867]: W1006 13:19:45.115012 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68750dd5_11c8_4fee_853c_09b68df5aff8.slice/crio-8557d7477d4ea926d7270cdfea88f7e05c9e6b0dd90de44b0d4c206878528126 WatchSource:0}: Error finding container 8557d7477d4ea926d7270cdfea88f7e05c9e6b0dd90de44b0d4c206878528126: Status 404 returned error can't find the container with id 8557d7477d4ea926d7270cdfea88f7e05c9e6b0dd90de44b0d4c206878528126 Oct 06 13:19:45 crc kubenswrapper[4867]: W1006 13:19:45.124823 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0017037b_73b8_4ff2_ad33_4f3cdfeb68b4.slice/crio-5894e20960f41365f9cfbf0d46aa4396067216e0913cd9e1adb6776ccb5920cc WatchSource:0}: Error finding container 5894e20960f41365f9cfbf0d46aa4396067216e0913cd9e1adb6776ccb5920cc: Status 404 returned error can't find the container with id 5894e20960f41365f9cfbf0d46aa4396067216e0913cd9e1adb6776ccb5920cc Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.126614 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.163084 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.187496 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.199013 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c44d66bd9-sfd9p"] Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.284956 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b621b65-c24b-4a46-96b9-19939ada50e0" path="/var/lib/kubelet/pods/6b621b65-c24b-4a46-96b9-19939ada50e0/volumes" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.285733 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3b4b5c6-a702-4fae-9d27-d09bdc486da8" path="/var/lib/kubelet/pods/f3b4b5c6-a702-4fae-9d27-d09bdc486da8/volumes" Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.286127 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.286160 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tg8j4"] Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.286169 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d858c69c-6g9hk"] Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.286179 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k22cm"] Oct 06 13:19:45 crc kubenswrapper[4867]: W1006 13:19:45.330523 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7478d336_9573_432c_8d73_f7396d652085.slice/crio-e83646c77741029a70db0ff142fd49847ed5871b1abe89f2d3afcf5ed5d0344c WatchSource:0}: Error finding container e83646c77741029a70db0ff142fd49847ed5871b1abe89f2d3afcf5ed5d0344c: Status 404 returned error can't find the container with id e83646c77741029a70db0ff142fd49847ed5871b1abe89f2d3afcf5ed5d0344c Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.349306 4867 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 13:19:45 crc kubenswrapper[4867]: W1006 13:19:45.419218 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8d27ae1_8b6d_4a9d_b302_a354673be3be.slice/crio-888f7cd5990b0f3cdc70b5ca991489fdc67dbd01f56350ce97e50b4b7c96f8f5 WatchSource:0}: Error finding container 888f7cd5990b0f3cdc70b5ca991489fdc67dbd01f56350ce97e50b4b7c96f8f5: Status 404 returned error can't find the container with id 888f7cd5990b0f3cdc70b5ca991489fdc67dbd01f56350ce97e50b4b7c96f8f5 Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.760282 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"40e8af9c-90c3-4d15-b8c8-c7b35447bf17","Type":"ContainerStarted","Data":"e720d98932a253c4a5be335430a965eed6314e5fc8cc94d8297d6be268b3eea6"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.764537 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" event={"ID":"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4","Type":"ContainerStarted","Data":"5894e20960f41365f9cfbf0d46aa4396067216e0913cd9e1adb6776ccb5920cc"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.773210 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"469e79f5-1d34-4151-ae0b-81301742c10c","Type":"ContainerStarted","Data":"0a1c38f4628d485fa378ca9cdfea567c0ab3ac7fa6c60d75b5b9e35db5cae979"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.778678 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"acd2b7ce-fe29-4b71-b730-7b1212f4416d","Type":"ContainerStarted","Data":"19651825ce118a919bd2302796f8e4fbd765e224db8b9ee4f0efa2d0ae55fa1e"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.782049 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tg8j4" 
event={"ID":"68750dd5-11c8-4fee-853c-09b68df5aff8","Type":"ContainerStarted","Data":"8557d7477d4ea926d7270cdfea88f7e05c9e6b0dd90de44b0d4c206878528126"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.784759 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"acd7f8b9-810f-4e76-b971-c466bf7d4a5b","Type":"ContainerStarted","Data":"e5088680bb2cbc26e4df1b53ba9ec79152a37c9f2acbb99888ad7bdc7995ec7e"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.788880 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k22cm" event={"ID":"7478d336-9573-432c-8d73-f7396d652085","Type":"ContainerStarted","Data":"e83646c77741029a70db0ff142fd49847ed5871b1abe89f2d3afcf5ed5d0344c"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.810451 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ec109351-f578-4141-8193-44f6433880b3","Type":"ContainerStarted","Data":"37e5a94c0cd97e79cdafccda27a140a893d7ecd317e12db93d96cc8b65bc3061"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.815848 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd","Type":"ContainerStarted","Data":"72bd4e9b36729f7985e821cedca4e8b59ddfd2a1408f9cd6ff440831de573b83"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.821197 4867 generic.go:334] "Generic (PLEG): container finished" podID="34ae7d6f-ca2c-4d51-a216-d0e8cecd8441" containerID="f9c937b505f447e2ff67198069d1ab03904e51711bfa88dbb18b7437198401be" exitCode=0 Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.821286 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt" event={"ID":"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441","Type":"ContainerDied","Data":"f9c937b505f447e2ff67198069d1ab03904e51711bfa88dbb18b7437198401be"} Oct 06 13:19:45 crc kubenswrapper[4867]: 
I1006 13:19:45.821305 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt" event={"ID":"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441","Type":"ContainerStarted","Data":"5208616f11ea22fdc38b76d44568189bbfa6c9ec7979d99322e896ef246cd4b3"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.827448 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b8d27ae1-8b6d-4a9d-b302-a354673be3be","Type":"ContainerStarted","Data":"888f7cd5990b0f3cdc70b5ca991489fdc67dbd01f56350ce97e50b4b7c96f8f5"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.829443 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"4beec03b-3d57-4c36-a149-153bb022bd7a","Type":"ContainerStarted","Data":"6fcd950b47388f53e71c14ea2c2892a9748216cf4ba2d9e376bcd169deb2b979"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.833439 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d858c69c-6g9hk" event={"ID":"dd3db137-4c43-4e44-abb9-707d0a322393","Type":"ContainerStarted","Data":"eda7cfe6aa482bd8c0161a3eaa3e68c521e70ad93ec595afc776dfa17137897b"} Oct 06 13:19:45 crc kubenswrapper[4867]: I1006 13:19:45.942368 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 13:19:46 crc kubenswrapper[4867]: I1006 13:19:46.852906 4867 generic.go:334] "Generic (PLEG): container finished" podID="dd3db137-4c43-4e44-abb9-707d0a322393" containerID="3eb0e7510dad99a8ee564b607f70e474108741e14e0b2890e8859d17f1389bac" exitCode=0 Oct 06 13:19:46 crc kubenswrapper[4867]: I1006 13:19:46.852941 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d858c69c-6g9hk" event={"ID":"dd3db137-4c43-4e44-abb9-707d0a322393","Type":"ContainerDied","Data":"3eb0e7510dad99a8ee564b607f70e474108741e14e0b2890e8859d17f1389bac"} Oct 06 13:19:46 crc kubenswrapper[4867]: I1006 13:19:46.859556 4867 
generic.go:334] "Generic (PLEG): container finished" podID="0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" containerID="597a479a3cb886a14d2e7909f3c6b700f6a988b0c7e14aa3b60bbad10d6ed2a9" exitCode=0 Oct 06 13:19:46 crc kubenswrapper[4867]: I1006 13:19:46.859599 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" event={"ID":"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4","Type":"ContainerDied","Data":"597a479a3cb886a14d2e7909f3c6b700f6a988b0c7e14aa3b60bbad10d6ed2a9"} Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.025855 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6nfld"] Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.028473 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.031954 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.034648 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6nfld"] Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.038778 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-ovn-rundir\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.038918 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " 
pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.038956 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-config\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.039046 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-combined-ca-bundle\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.039105 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gthjw\" (UniqueName: \"kubernetes.io/projected/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-kube-api-access-gthjw\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.039437 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-ovs-rundir\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.142419 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-ovn-rundir\") pod \"ovn-controller-metrics-6nfld\" (UID: 
\"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.142878 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.142912 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-config\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.142971 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-combined-ca-bundle\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.143010 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gthjw\" (UniqueName: \"kubernetes.io/projected/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-kube-api-access-gthjw\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.143048 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-ovs-rundir\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " 
pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.143073 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-ovn-rundir\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.143216 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-ovs-rundir\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.144217 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-config\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.152628 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.153841 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-combined-ca-bundle\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 
13:19:47.167456 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gthjw\" (UniqueName: \"kubernetes.io/projected/cbe16793-d6a8-4aa9-b509-3f3b710b70e3-kube-api-access-gthjw\") pod \"ovn-controller-metrics-6nfld\" (UID: \"cbe16793-d6a8-4aa9-b509-3f3b710b70e3\") " pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.207948 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d858c69c-6g9hk"] Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.262825 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bcf488c49-2kmfs"] Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.271734 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bcf488c49-2kmfs"] Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.271899 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.276646 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.356328 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-config\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.356385 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-dns-svc\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc 
kubenswrapper[4867]: I1006 13:19:47.356412 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-ovsdbserver-sb\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.357264 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlbhb\" (UniqueName: \"kubernetes.io/projected/8773a802-8117-4ed5-a3fd-a8898833fb11-kube-api-access-nlbhb\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.368815 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c44d66bd9-sfd9p"] Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.385894 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785fb6df79-8gqlr"] Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.410566 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6nfld" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.421373 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.428470 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.446430 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785fb6df79-8gqlr"] Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.461329 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlbhb\" (UniqueName: \"kubernetes.io/projected/8773a802-8117-4ed5-a3fd-a8898833fb11-kube-api-access-nlbhb\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.461438 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-nb\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.461522 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-config\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.461589 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg6ck\" (UniqueName: \"kubernetes.io/projected/b149ebae-e3d9-4e9a-8d34-511370e9612b-kube-api-access-lg6ck\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " 
pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.461634 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-sb\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.461695 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-config\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.461720 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-dns-svc\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.461739 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-dns-svc\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.461763 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-ovsdbserver-sb\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 
crc kubenswrapper[4867]: I1006 13:19:47.462690 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-ovsdbserver-sb\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.463354 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-config\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.463953 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-dns-svc\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.494233 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlbhb\" (UniqueName: \"kubernetes.io/projected/8773a802-8117-4ed5-a3fd-a8898833fb11-kube-api-access-nlbhb\") pod \"dnsmasq-dns-6bcf488c49-2kmfs\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.563755 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-config\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.563849 4867 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-lg6ck\" (UniqueName: \"kubernetes.io/projected/b149ebae-e3d9-4e9a-8d34-511370e9612b-kube-api-access-lg6ck\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.563876 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-sb\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.563923 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-dns-svc\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.563982 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-nb\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.564865 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-nb\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.565511 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-config\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.567573 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-dns-svc\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.568059 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-sb\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.584490 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg6ck\" (UniqueName: \"kubernetes.io/projected/b149ebae-e3d9-4e9a-8d34-511370e9612b-kube-api-access-lg6ck\") pod \"dnsmasq-dns-785fb6df79-8gqlr\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.609236 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:19:47 crc kubenswrapper[4867]: I1006 13:19:47.773755 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:19:48 crc kubenswrapper[4867]: W1006 13:19:48.967966 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18420b8b_345a_41e6_b753_6766143362a3.slice/crio-0dd96fd21550f8f4b69dbe5d249b62fa082b71f821c0ae698a00e99a70c25891 WatchSource:0}: Error finding container 0dd96fd21550f8f4b69dbe5d249b62fa082b71f821c0ae698a00e99a70c25891: Status 404 returned error can't find the container with id 0dd96fd21550f8f4b69dbe5d249b62fa082b71f821c0ae698a00e99a70c25891 Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.047701 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt" Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.100364 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-config\") pod \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.100925 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-dns-svc\") pod \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.100989 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srd9g\" (UniqueName: \"kubernetes.io/projected/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-kube-api-access-srd9g\") pod \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\" (UID: \"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441\") " Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.107766 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-kube-api-access-srd9g" (OuterVolumeSpecName: "kube-api-access-srd9g") pod "34ae7d6f-ca2c-4d51-a216-d0e8cecd8441" (UID: "34ae7d6f-ca2c-4d51-a216-d0e8cecd8441"). InnerVolumeSpecName "kube-api-access-srd9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.122401 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-config" (OuterVolumeSpecName: "config") pod "34ae7d6f-ca2c-4d51-a216-d0e8cecd8441" (UID: "34ae7d6f-ca2c-4d51-a216-d0e8cecd8441"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.139661 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34ae7d6f-ca2c-4d51-a216-d0e8cecd8441" (UID: "34ae7d6f-ca2c-4d51-a216-d0e8cecd8441"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.204174 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.204219 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.204233 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srd9g\" (UniqueName: \"kubernetes.io/projected/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441-kube-api-access-srd9g\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.895432 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"18420b8b-345a-41e6-b753-6766143362a3","Type":"ContainerStarted","Data":"0dd96fd21550f8f4b69dbe5d249b62fa082b71f821c0ae698a00e99a70c25891"} Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.898028 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt" event={"ID":"34ae7d6f-ca2c-4d51-a216-d0e8cecd8441","Type":"ContainerDied","Data":"5208616f11ea22fdc38b76d44568189bbfa6c9ec7979d99322e896ef246cd4b3"} Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.898087 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-786b66f8cc-8vwnt" Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.898096 4867 scope.go:117] "RemoveContainer" containerID="f9c937b505f447e2ff67198069d1ab03904e51711bfa88dbb18b7437198401be" Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.946202 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-786b66f8cc-8vwnt"] Oct 06 13:19:49 crc kubenswrapper[4867]: I1006 13:19:49.952890 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-786b66f8cc-8vwnt"] Oct 06 13:19:51 crc kubenswrapper[4867]: I1006 13:19:51.238453 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ae7d6f-ca2c-4d51-a216-d0e8cecd8441" path="/var/lib/kubelet/pods/34ae7d6f-ca2c-4d51-a216-d0e8cecd8441/volumes" Oct 06 13:19:53 crc kubenswrapper[4867]: E1006 13:19:53.395065 4867 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 06 13:19:53 crc kubenswrapper[4867]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/dd3db137-4c43-4e44-abb9-707d0a322393/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 06 13:19:53 crc kubenswrapper[4867]: > podSandboxID="eda7cfe6aa482bd8c0161a3eaa3e68c521e70ad93ec595afc776dfa17137897b" Oct 06 13:19:53 crc kubenswrapper[4867]: E1006 13:19:53.395867 4867 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 06 13:19:53 crc kubenswrapper[4867]: container &Container{Name:dnsmasq-dns,Image:38.102.83.151:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jx8ml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86d858c69c-6g9hk_openstack(dd3db137-4c43-4e44-abb9-707d0a322393): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/dd3db137-4c43-4e44-abb9-707d0a322393/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 06 13:19:53 crc kubenswrapper[4867]: > logger="UnhandledError" Oct 06 13:19:53 crc kubenswrapper[4867]: E1006 13:19:53.397616 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/dd3db137-4c43-4e44-abb9-707d0a322393/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86d858c69c-6g9hk" podUID="dd3db137-4c43-4e44-abb9-707d0a322393" Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.698318 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d858c69c-6g9hk" Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.838119 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-config\") pod \"dd3db137-4c43-4e44-abb9-707d0a322393\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.838172 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-dns-svc\") pod \"dd3db137-4c43-4e44-abb9-707d0a322393\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.838746 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx8ml\" (UniqueName: \"kubernetes.io/projected/dd3db137-4c43-4e44-abb9-707d0a322393-kube-api-access-jx8ml\") pod \"dd3db137-4c43-4e44-abb9-707d0a322393\" (UID: \"dd3db137-4c43-4e44-abb9-707d0a322393\") " Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.843985 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3db137-4c43-4e44-abb9-707d0a322393-kube-api-access-jx8ml" (OuterVolumeSpecName: "kube-api-access-jx8ml") pod "dd3db137-4c43-4e44-abb9-707d0a322393" (UID: "dd3db137-4c43-4e44-abb9-707d0a322393"). InnerVolumeSpecName "kube-api-access-jx8ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.875862 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd3db137-4c43-4e44-abb9-707d0a322393" (UID: "dd3db137-4c43-4e44-abb9-707d0a322393"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.877721 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-config" (OuterVolumeSpecName: "config") pod "dd3db137-4c43-4e44-abb9-707d0a322393" (UID: "dd3db137-4c43-4e44-abb9-707d0a322393"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.940985 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx8ml\" (UniqueName: \"kubernetes.io/projected/dd3db137-4c43-4e44-abb9-707d0a322393-kube-api-access-jx8ml\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.941029 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.941042 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd3db137-4c43-4e44-abb9-707d0a322393-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.941822 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d858c69c-6g9hk" Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.941821 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d858c69c-6g9hk" event={"ID":"dd3db137-4c43-4e44-abb9-707d0a322393","Type":"ContainerDied","Data":"eda7cfe6aa482bd8c0161a3eaa3e68c521e70ad93ec595afc776dfa17137897b"} Oct 06 13:19:54 crc kubenswrapper[4867]: I1006 13:19:54.941898 4867 scope.go:117] "RemoveContainer" containerID="3eb0e7510dad99a8ee564b607f70e474108741e14e0b2890e8859d17f1389bac" Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.013273 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d858c69c-6g9hk"] Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.020178 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86d858c69c-6g9hk"] Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.204162 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bcf488c49-2kmfs"] Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.213546 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6nfld"] Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.219871 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785fb6df79-8gqlr"] Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.232812 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3db137-4c43-4e44-abb9-707d0a322393" path="/var/lib/kubelet/pods/dd3db137-4c43-4e44-abb9-707d0a322393/volumes" Oct 06 13:19:55 crc kubenswrapper[4867]: W1006 13:19:55.511217 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbe16793_d6a8_4aa9_b509_3f3b710b70e3.slice/crio-8be99743fccb2554a9be6f839a9d0aa910655444d6f3f3ae73afc66167250308 WatchSource:0}: Error finding container 
8be99743fccb2554a9be6f839a9d0aa910655444d6f3f3ae73afc66167250308: Status 404 returned error can't find the container with id 8be99743fccb2554a9be6f839a9d0aa910655444d6f3f3ae73afc66167250308 Oct 06 13:19:55 crc kubenswrapper[4867]: W1006 13:19:55.530931 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8773a802_8117_4ed5_a3fd_a8898833fb11.slice/crio-94d877e397eb7b7fa2e965efee653fc09d2b117b905210a35e0662d1a64842be WatchSource:0}: Error finding container 94d877e397eb7b7fa2e965efee653fc09d2b117b905210a35e0662d1a64842be: Status 404 returned error can't find the container with id 94d877e397eb7b7fa2e965efee653fc09d2b117b905210a35e0662d1a64842be Oct 06 13:19:55 crc kubenswrapper[4867]: W1006 13:19:55.532615 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb149ebae_e3d9_4e9a_8d34_511370e9612b.slice/crio-a800c5210674b3ca7c9322afd1e3f7f13ae5c5d16951d96416d9e7d4ef0cbde2 WatchSource:0}: Error finding container a800c5210674b3ca7c9322afd1e3f7f13ae5c5d16951d96416d9e7d4ef0cbde2: Status 404 returned error can't find the container with id a800c5210674b3ca7c9322afd1e3f7f13ae5c5d16951d96416d9e7d4ef0cbde2 Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.953222 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ec109351-f578-4141-8193-44f6433880b3","Type":"ContainerStarted","Data":"5c899fdb22d2939ab897749789c07c1142d5477b92b57eed862b9ea52769632e"} Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.954707 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" event={"ID":"8773a802-8117-4ed5-a3fd-a8898833fb11","Type":"ContainerStarted","Data":"94d877e397eb7b7fa2e965efee653fc09d2b117b905210a35e0662d1a64842be"} Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.956210 4867 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-controller-metrics-6nfld" event={"ID":"cbe16793-d6a8-4aa9-b509-3f3b710b70e3","Type":"ContainerStarted","Data":"8be99743fccb2554a9be6f839a9d0aa910655444d6f3f3ae73afc66167250308"} Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.959722 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" event={"ID":"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4","Type":"ContainerStarted","Data":"a135d028414175f5944be7207f89535aa7d79d348a7d48149be7c1dc3feb0221"} Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.959829 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" podUID="0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" containerName="dnsmasq-dns" containerID="cri-o://a135d028414175f5944be7207f89535aa7d79d348a7d48149be7c1dc3feb0221" gracePeriod=10 Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.960104 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:55 crc kubenswrapper[4867]: I1006 13:19:55.963223 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" event={"ID":"b149ebae-e3d9-4e9a-8d34-511370e9612b","Type":"ContainerStarted","Data":"a800c5210674b3ca7c9322afd1e3f7f13ae5c5d16951d96416d9e7d4ef0cbde2"} Oct 06 13:19:56 crc kubenswrapper[4867]: I1006 13:19:55.999970 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" podStartSLOduration=25.999940436 podStartE2EDuration="25.999940436s" podCreationTimestamp="2025-10-06 13:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:19:55.999543015 +0000 UTC m=+975.457491159" watchObservedRunningTime="2025-10-06 13:19:55.999940436 +0000 UTC m=+975.457888580" Oct 06 13:19:56 crc kubenswrapper[4867]: I1006 
13:19:56.974466 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"40e8af9c-90c3-4d15-b8c8-c7b35447bf17","Type":"ContainerStarted","Data":"39b5c5b26fa1a7b00d8e8523793ca96e9ed27db3efc5201bcd4e94e4f3c44028"} Oct 06 13:19:56 crc kubenswrapper[4867]: I1006 13:19:56.974852 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 06 13:19:56 crc kubenswrapper[4867]: I1006 13:19:56.978025 4867 generic.go:334] "Generic (PLEG): container finished" podID="0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" containerID="a135d028414175f5944be7207f89535aa7d79d348a7d48149be7c1dc3feb0221" exitCode=0 Oct 06 13:19:56 crc kubenswrapper[4867]: I1006 13:19:56.978068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" event={"ID":"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4","Type":"ContainerDied","Data":"a135d028414175f5944be7207f89535aa7d79d348a7d48149be7c1dc3feb0221"} Oct 06 13:19:56 crc kubenswrapper[4867]: I1006 13:19:56.978120 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" event={"ID":"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4","Type":"ContainerDied","Data":"5894e20960f41365f9cfbf0d46aa4396067216e0913cd9e1adb6776ccb5920cc"} Oct 06 13:19:56 crc kubenswrapper[4867]: I1006 13:19:56.978132 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5894e20960f41365f9cfbf0d46aa4396067216e0913cd9e1adb6776ccb5920cc" Oct 06 13:19:56 crc kubenswrapper[4867]: I1006 13:19:56.980542 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f249bfb-ab86-491d-9d1c-b3930fdea27d","Type":"ContainerStarted","Data":"ed05e3e744e02f905cc03b57571a1f3fbc2df72ee0c2ceefc3ed6b911570636c"} Oct 06 13:19:56 crc kubenswrapper[4867]: I1006 13:19:56.994028 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" 
podStartSLOduration=13.715222458 podStartE2EDuration="22.994008367s" podCreationTimestamp="2025-10-06 13:19:34 +0000 UTC" firstStartedPulling="2025-10-06 13:19:45.107855804 +0000 UTC m=+964.565803948" lastFinishedPulling="2025-10-06 13:19:54.386641713 +0000 UTC m=+973.844589857" observedRunningTime="2025-10-06 13:19:56.993698448 +0000 UTC m=+976.451646602" watchObservedRunningTime="2025-10-06 13:19:56.994008367 +0000 UTC m=+976.451956511" Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.814211 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.895456 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-dns-svc\") pod \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.895502 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-config\") pod \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.895523 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq9l4\" (UniqueName: \"kubernetes.io/projected/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-kube-api-access-gq9l4\") pod \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\" (UID: \"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4\") " Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.919620 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-kube-api-access-gq9l4" (OuterVolumeSpecName: "kube-api-access-gq9l4") pod "0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" (UID: 
"0017037b-73b8-4ff2-ad33-4f3cdfeb68b4"). InnerVolumeSpecName "kube-api-access-gq9l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.941370 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-config" (OuterVolumeSpecName: "config") pod "0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" (UID: "0017037b-73b8-4ff2-ad33-4f3cdfeb68b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.946900 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" (UID: "0017037b-73b8-4ff2-ad33-4f3cdfeb68b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.993240 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k22cm" event={"ID":"7478d336-9573-432c-8d73-f7396d652085","Type":"ContainerStarted","Data":"f29ee56036d9e1ab4e55501d38f6d367a396b441b1f4d2591d9d8948b35b5da5"} Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.993426 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c44d66bd9-sfd9p" Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.997175 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.997214 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:57.997230 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq9l4\" (UniqueName: \"kubernetes.io/projected/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4-kube-api-access-gq9l4\") on node \"crc\" DevicePath \"\"" Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:58.033445 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c44d66bd9-sfd9p"] Oct 06 13:19:58 crc kubenswrapper[4867]: I1006 13:19:58.041633 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c44d66bd9-sfd9p"] Oct 06 13:19:59 crc kubenswrapper[4867]: I1006 13:19:59.232511 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" path="/var/lib/kubelet/pods/0017037b-73b8-4ff2-ad33-4f3cdfeb68b4/volumes" Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.027938 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd","Type":"ContainerStarted","Data":"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4"} Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.037075 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"18420b8b-345a-41e6-b753-6766143362a3","Type":"ContainerStarted","Data":"2374b9fc75bbae0a55d6d6b899321adcbfd13b47e6bde15cb74fbb10a6557f8f"} Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.039303 4867 generic.go:334] "Generic (PLEG): container finished" podID="8773a802-8117-4ed5-a3fd-a8898833fb11" containerID="51bcb1036ab8b4eb0b712a71b741e70c20e4d99580f6bdb05a5ed6e36f891519" exitCode=0 Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.039369 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" event={"ID":"8773a802-8117-4ed5-a3fd-a8898833fb11","Type":"ContainerDied","Data":"51bcb1036ab8b4eb0b712a71b741e70c20e4d99580f6bdb05a5ed6e36f891519"} Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.047295 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b8d27ae1-8b6d-4a9d-b302-a354673be3be","Type":"ContainerStarted","Data":"920e0272ada9c1028bb5efd66c92c220c932f5d89bfb446fb0b9c92bd0768691"} Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.050223 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"4beec03b-3d57-4c36-a149-153bb022bd7a","Type":"ContainerStarted","Data":"d9a9433273d2f2fb162c39c0ba1cfae30efc9f88f8074ac3aba59f371dc84a67"} Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.061529 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"acd7f8b9-810f-4e76-b971-c466bf7d4a5b","Type":"ContainerStarted","Data":"20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f"} Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.068187 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"acd2b7ce-fe29-4b71-b730-7b1212f4416d","Type":"ContainerStarted","Data":"cbf9aab93f2d13b75ab9a5064978555f328531fde10508d110221b4310c94a74"} Oct 06 13:20:01 crc kubenswrapper[4867]: 
I1006 13:20:01.073874 4867 generic.go:334] "Generic (PLEG): container finished" podID="7478d336-9573-432c-8d73-f7396d652085" containerID="f29ee56036d9e1ab4e55501d38f6d367a396b441b1f4d2591d9d8948b35b5da5" exitCode=0 Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.073954 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k22cm" event={"ID":"7478d336-9573-432c-8d73-f7396d652085","Type":"ContainerDied","Data":"f29ee56036d9e1ab4e55501d38f6d367a396b441b1f4d2591d9d8948b35b5da5"} Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.076460 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tg8j4" event={"ID":"68750dd5-11c8-4fee-853c-09b68df5aff8","Type":"ContainerStarted","Data":"5536ef722cb03d8fe0a2c5904382849aba14422a14178cdb8fd2be805ed97422"} Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.076871 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-tg8j4" Oct 06 13:20:01 crc kubenswrapper[4867]: I1006 13:20:01.175563 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-tg8j4" podStartSLOduration=11.474974379 podStartE2EDuration="21.175536691s" podCreationTimestamp="2025-10-06 13:19:40 +0000 UTC" firstStartedPulling="2025-10-06 13:19:45.121058795 +0000 UTC m=+964.579006939" lastFinishedPulling="2025-10-06 13:19:54.821621107 +0000 UTC m=+974.279569251" observedRunningTime="2025-10-06 13:20:01.175301864 +0000 UTC m=+980.633250018" watchObservedRunningTime="2025-10-06 13:20:01.175536691 +0000 UTC m=+980.633484835" Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.106613 4867 generic.go:334] "Generic (PLEG): container finished" podID="b149ebae-e3d9-4e9a-8d34-511370e9612b" containerID="67364fa1d2f7303a62af9bbf289f80b2d7d0fb86643fb1175e978e52aaede0de" exitCode=0 Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.107283 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" event={"ID":"b149ebae-e3d9-4e9a-8d34-511370e9612b","Type":"ContainerDied","Data":"67364fa1d2f7303a62af9bbf289f80b2d7d0fb86643fb1175e978e52aaede0de"} Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.109999 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k22cm" event={"ID":"7478d336-9573-432c-8d73-f7396d652085","Type":"ContainerStarted","Data":"e94385b91375b7730ff5a1422f92c1f51967ae93bbee4767261cc6cf05bc33a0"} Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.122013 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"18420b8b-345a-41e6-b753-6766143362a3","Type":"ContainerStarted","Data":"be125454da2b2a7998d76cfe3ed51cebeb0b947ea143c91dfb0f73f76b543666"} Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.131009 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" event={"ID":"8773a802-8117-4ed5-a3fd-a8898833fb11","Type":"ContainerStarted","Data":"90c2bf0097bf3e243923396bd3d683401d6f091d93dd15a6b5dfba37e59bf64c"} Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.131673 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.138351 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b8d27ae1-8b6d-4a9d-b302-a354673be3be","Type":"ContainerStarted","Data":"21a2b047a16e8898f5002635514947967a1bb03b411e21a5557560700e5346ca"} Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.144736 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6nfld" event={"ID":"cbe16793-d6a8-4aa9-b509-3f3b710b70e3","Type":"ContainerStarted","Data":"f635a2ba5ac180eee49e288bbfbcee3a8ff34d6d8a2615a3a737802487b1a0e0"} Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.148629 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"469e79f5-1d34-4151-ae0b-81301742c10c","Type":"ContainerStarted","Data":"718eb30cf43a0ebe2a9c1cade0cd6ce4f16a1a0abf514b09678ed88ccfe2febe"} Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.151144 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.162994 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.475352104 podStartE2EDuration="19.16297347s" podCreationTimestamp="2025-10-06 13:19:43 +0000 UTC" firstStartedPulling="2025-10-06 13:19:48.998800434 +0000 UTC m=+968.456748618" lastFinishedPulling="2025-10-06 13:20:01.68642184 +0000 UTC m=+981.144369984" observedRunningTime="2025-10-06 13:20:02.155412103 +0000 UTC m=+981.613360257" watchObservedRunningTime="2025-10-06 13:20:02.16297347 +0000 UTC m=+981.620921614" Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.211994 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6nfld" podStartSLOduration=10.04544853 podStartE2EDuration="16.21196899s" podCreationTimestamp="2025-10-06 13:19:46 +0000 UTC" firstStartedPulling="2025-10-06 13:19:55.516475676 +0000 UTC m=+974.974423820" lastFinishedPulling="2025-10-06 13:20:01.682996136 +0000 UTC m=+981.140944280" observedRunningTime="2025-10-06 13:20:02.176033367 +0000 UTC m=+981.633981531" watchObservedRunningTime="2025-10-06 13:20:02.21196899 +0000 UTC m=+981.669917134" Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.218568 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.792095363 podStartE2EDuration="21.21855571s" podCreationTimestamp="2025-10-06 13:19:41 +0000 UTC" firstStartedPulling="2025-10-06 13:19:45.425492929 +0000 UTC 
m=+964.883441073" lastFinishedPulling="2025-10-06 13:20:01.851953276 +0000 UTC m=+981.309901420" observedRunningTime="2025-10-06 13:20:02.211929619 +0000 UTC m=+981.669877773" watchObservedRunningTime="2025-10-06 13:20:02.21855571 +0000 UTC m=+981.676503854" Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.242180 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" podStartSLOduration=15.242150135 podStartE2EDuration="15.242150135s" podCreationTimestamp="2025-10-06 13:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:20:02.233432147 +0000 UTC m=+981.691380291" watchObservedRunningTime="2025-10-06 13:20:02.242150135 +0000 UTC m=+981.700098279" Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.260446 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.490594196 podStartE2EDuration="26.260419515s" podCreationTimestamp="2025-10-06 13:19:36 +0000 UTC" firstStartedPulling="2025-10-06 13:19:45.03055217 +0000 UTC m=+964.488500324" lastFinishedPulling="2025-10-06 13:19:55.800377489 +0000 UTC m=+975.258325643" observedRunningTime="2025-10-06 13:20:02.248648553 +0000 UTC m=+981.706596697" watchObservedRunningTime="2025-10-06 13:20:02.260419515 +0000 UTC m=+981.718367659" Oct 06 13:20:02 crc kubenswrapper[4867]: I1006 13:20:02.631935 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 06 13:20:03 crc kubenswrapper[4867]: I1006 13:20:03.159125 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" event={"ID":"b149ebae-e3d9-4e9a-8d34-511370e9612b","Type":"ContainerStarted","Data":"f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1"} Oct 06 13:20:03 crc kubenswrapper[4867]: I1006 13:20:03.159446 4867 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:20:03 crc kubenswrapper[4867]: I1006 13:20:03.161979 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k22cm" event={"ID":"7478d336-9573-432c-8d73-f7396d652085","Type":"ContainerStarted","Data":"66dd36d7f8d000edd6b005b5856bf08ac7bf2680d4096349d4a4a28dce7cbe8c"} Oct 06 13:20:03 crc kubenswrapper[4867]: I1006 13:20:03.162261 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:20:03 crc kubenswrapper[4867]: I1006 13:20:03.163115 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:20:03 crc kubenswrapper[4867]: I1006 13:20:03.178670 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" podStartSLOduration=16.178649942 podStartE2EDuration="16.178649942s" podCreationTimestamp="2025-10-06 13:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:20:03.177232403 +0000 UTC m=+982.635180547" watchObservedRunningTime="2025-10-06 13:20:03.178649942 +0000 UTC m=+982.636598086" Oct 06 13:20:03 crc kubenswrapper[4867]: I1006 13:20:03.188533 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 06 13:20:03 crc kubenswrapper[4867]: I1006 13:20:03.207046 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-k22cm" podStartSLOduration=13.943569497 podStartE2EDuration="23.207021608s" podCreationTimestamp="2025-10-06 13:19:40 +0000 UTC" firstStartedPulling="2025-10-06 13:19:45.410003675 +0000 UTC m=+964.867951819" lastFinishedPulling="2025-10-06 13:19:54.673455786 +0000 UTC m=+974.131403930" observedRunningTime="2025-10-06 
13:20:03.20089428 +0000 UTC m=+982.658842424" watchObservedRunningTime="2025-10-06 13:20:03.207021608 +0000 UTC m=+982.664969752" Oct 06 13:20:03 crc kubenswrapper[4867]: I1006 13:20:03.236142 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 06 13:20:03 crc kubenswrapper[4867]: I1006 13:20:03.632476 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 06 13:20:03 crc kubenswrapper[4867]: I1006 13:20:03.670549 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 06 13:20:04 crc kubenswrapper[4867]: I1006 13:20:04.172479 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.104200 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.184496 4867 generic.go:334] "Generic (PLEG): container finished" podID="ec109351-f578-4141-8193-44f6433880b3" containerID="5c899fdb22d2939ab897749789c07c1142d5477b92b57eed862b9ea52769632e" exitCode=0 Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.184548 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ec109351-f578-4141-8193-44f6433880b3","Type":"ContainerDied","Data":"5c899fdb22d2939ab897749789c07c1142d5477b92b57eed862b9ea52769632e"} Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.258188 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.258264 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.644084 4867 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-northd-0"] Oct 06 13:20:05 crc kubenswrapper[4867]: E1006 13:20:05.645394 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" containerName="dnsmasq-dns" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.645479 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" containerName="dnsmasq-dns" Oct 06 13:20:05 crc kubenswrapper[4867]: E1006 13:20:05.645549 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" containerName="init" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.645619 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" containerName="init" Oct 06 13:20:05 crc kubenswrapper[4867]: E1006 13:20:05.645689 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ae7d6f-ca2c-4d51-a216-d0e8cecd8441" containerName="init" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.645762 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ae7d6f-ca2c-4d51-a216-d0e8cecd8441" containerName="init" Oct 06 13:20:05 crc kubenswrapper[4867]: E1006 13:20:05.645844 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3db137-4c43-4e44-abb9-707d0a322393" containerName="init" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.645899 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3db137-4c43-4e44-abb9-707d0a322393" containerName="init" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.646121 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ae7d6f-ca2c-4d51-a216-d0e8cecd8441" containerName="init" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.646209 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="0017037b-73b8-4ff2-ad33-4f3cdfeb68b4" containerName="dnsmasq-dns" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.646292 
4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3db137-4c43-4e44-abb9-707d0a322393" containerName="init" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.647292 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.657601 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.657916 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.658062 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.658183 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-rkt9j" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.669668 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.752016 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.752416 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.752545 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.752674 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klzcx\" (UniqueName: \"kubernetes.io/projected/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-kube-api-access-klzcx\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.752772 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-config\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.752867 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-scripts\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.752993 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.854564 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klzcx\" (UniqueName: 
\"kubernetes.io/projected/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-kube-api-access-klzcx\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.854614 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-config\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.854637 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-scripts\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.854665 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.854711 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.854781 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: 
I1006 13:20:05.854817 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.855842 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-scripts\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.855880 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-config\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.858218 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.861877 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.862131 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.864742 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.870913 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klzcx\" (UniqueName: \"kubernetes.io/projected/c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad-kube-api-access-klzcx\") pod \"ovn-northd-0\" (UID: \"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad\") " pod="openstack/ovn-northd-0" Oct 06 13:20:05 crc kubenswrapper[4867]: I1006 13:20:05.966903 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 13:20:06 crc kubenswrapper[4867]: I1006 13:20:06.204342 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ec109351-f578-4141-8193-44f6433880b3","Type":"ContainerStarted","Data":"e5248555fd74c4258af7d6c1267966969860562c3a346edfcc852e9f04f0fcf3"} Oct 06 13:20:06 crc kubenswrapper[4867]: I1006 13:20:06.213581 4867 generic.go:334] "Generic (PLEG): container finished" podID="acd2b7ce-fe29-4b71-b730-7b1212f4416d" containerID="cbf9aab93f2d13b75ab9a5064978555f328531fde10508d110221b4310c94a74" exitCode=0 Oct 06 13:20:06 crc kubenswrapper[4867]: I1006 13:20:06.213641 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"acd2b7ce-fe29-4b71-b730-7b1212f4416d","Type":"ContainerDied","Data":"cbf9aab93f2d13b75ab9a5064978555f328531fde10508d110221b4310c94a74"} Oct 06 13:20:06 crc kubenswrapper[4867]: I1006 13:20:06.236935 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" 
podStartSLOduration=24.015007158 podStartE2EDuration="33.236912133s" podCreationTimestamp="2025-10-06 13:19:33 +0000 UTC" firstStartedPulling="2025-10-06 13:19:45.120642683 +0000 UTC m=+964.578590827" lastFinishedPulling="2025-10-06 13:19:54.342547658 +0000 UTC m=+973.800495802" observedRunningTime="2025-10-06 13:20:06.231606178 +0000 UTC m=+985.689554322" watchObservedRunningTime="2025-10-06 13:20:06.236912133 +0000 UTC m=+985.694860277" Oct 06 13:20:06 crc kubenswrapper[4867]: W1006 13:20:06.449212 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc47c2b04_7fb7_4fb9_bbed_6e9b88cbadad.slice/crio-ac290da29ede2a685ee79eca89194ddb3f2d21905f923a5a7145bc5513ad530c WatchSource:0}: Error finding container ac290da29ede2a685ee79eca89194ddb3f2d21905f923a5a7145bc5513ad530c: Status 404 returned error can't find the container with id ac290da29ede2a685ee79eca89194ddb3f2d21905f923a5a7145bc5513ad530c Oct 06 13:20:06 crc kubenswrapper[4867]: I1006 13:20:06.456352 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 13:20:06 crc kubenswrapper[4867]: I1006 13:20:06.930699 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bcf488c49-2kmfs"] Oct 06 13:20:06 crc kubenswrapper[4867]: I1006 13:20:06.932716 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" podUID="8773a802-8117-4ed5-a3fd-a8898833fb11" containerName="dnsmasq-dns" containerID="cri-o://90c2bf0097bf3e243923396bd3d683401d6f091d93dd15a6b5dfba37e59bf64c" gracePeriod=10 Oct 06 13:20:06 crc kubenswrapper[4867]: I1006 13:20:06.936489 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:20:06 crc kubenswrapper[4867]: I1006 13:20:06.970143 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67fcb78fdc-l7fqs"] 
Oct 06 13:20:06 crc kubenswrapper[4867]: I1006 13:20:06.971707 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:06 crc kubenswrapper[4867]: I1006 13:20:06.975398 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.001434 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fcb78fdc-l7fqs"] Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.085260 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72lk4\" (UniqueName: \"kubernetes.io/projected/c204022e-4511-4065-a43e-fa8ef02e2768-kube-api-access-72lk4\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.085368 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-config\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.085420 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-sb\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.085501 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-dns-svc\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.085550 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-nb\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.186794 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-dns-svc\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.186862 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-nb\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.186922 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72lk4\" (UniqueName: \"kubernetes.io/projected/c204022e-4511-4065-a43e-fa8ef02e2768-kube-api-access-72lk4\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.186992 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-config\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.187016 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-sb\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.187762 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-dns-svc\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.187801 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-nb\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.188941 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-sb\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.188945 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-config\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: 
\"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.211062 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72lk4\" (UniqueName: \"kubernetes.io/projected/c204022e-4511-4065-a43e-fa8ef02e2768-kube-api-access-72lk4\") pod \"dnsmasq-dns-67fcb78fdc-l7fqs\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.248882 4867 generic.go:334] "Generic (PLEG): container finished" podID="8773a802-8117-4ed5-a3fd-a8898833fb11" containerID="90c2bf0097bf3e243923396bd3d683401d6f091d93dd15a6b5dfba37e59bf64c" exitCode=0 Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.252997 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" event={"ID":"8773a802-8117-4ed5-a3fd-a8898833fb11","Type":"ContainerDied","Data":"90c2bf0097bf3e243923396bd3d683401d6f091d93dd15a6b5dfba37e59bf64c"} Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.262770 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad","Type":"ContainerStarted","Data":"ac290da29ede2a685ee79eca89194ddb3f2d21905f923a5a7145bc5513ad530c"} Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.265981 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"acd2b7ce-fe29-4b71-b730-7b1212f4416d","Type":"ContainerStarted","Data":"7d5492a56e6ee874627d89d9f4163ec460043dc231e20910d7c924cf806d803b"} Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.298302 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.590045882 podStartE2EDuration="34.298279964s" podCreationTimestamp="2025-10-06 13:19:33 +0000 UTC" firstStartedPulling="2025-10-06 
13:19:45.112906432 +0000 UTC m=+964.570854566" lastFinishedPulling="2025-10-06 13:19:54.821140504 +0000 UTC m=+974.279088648" observedRunningTime="2025-10-06 13:20:07.295655792 +0000 UTC m=+986.753603936" watchObservedRunningTime="2025-10-06 13:20:07.298279964 +0000 UTC m=+986.756228108" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.349088 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.610178 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" podUID="8773a802-8117-4ed5-a3fd-a8898833fb11" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.776450 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:20:07 crc kubenswrapper[4867]: I1006 13:20:07.875182 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.033984 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-config\") pod \"8773a802-8117-4ed5-a3fd-a8898833fb11\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.034518 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-dns-svc\") pod \"8773a802-8117-4ed5-a3fd-a8898833fb11\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.034719 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlbhb\" (UniqueName: \"kubernetes.io/projected/8773a802-8117-4ed5-a3fd-a8898833fb11-kube-api-access-nlbhb\") pod \"8773a802-8117-4ed5-a3fd-a8898833fb11\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.034788 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-ovsdbserver-sb\") pod \"8773a802-8117-4ed5-a3fd-a8898833fb11\" (UID: \"8773a802-8117-4ed5-a3fd-a8898833fb11\") " Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.039541 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8773a802-8117-4ed5-a3fd-a8898833fb11-kube-api-access-nlbhb" (OuterVolumeSpecName: "kube-api-access-nlbhb") pod "8773a802-8117-4ed5-a3fd-a8898833fb11" (UID: "8773a802-8117-4ed5-a3fd-a8898833fb11"). InnerVolumeSpecName "kube-api-access-nlbhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.071378 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8773a802-8117-4ed5-a3fd-a8898833fb11" (UID: "8773a802-8117-4ed5-a3fd-a8898833fb11"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.072032 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8773a802-8117-4ed5-a3fd-a8898833fb11" (UID: "8773a802-8117-4ed5-a3fd-a8898833fb11"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.074766 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-config" (OuterVolumeSpecName: "config") pod "8773a802-8117-4ed5-a3fd-a8898833fb11" (UID: "8773a802-8117-4ed5-a3fd-a8898833fb11"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.109213 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67fcb78fdc-l7fqs"] Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.126619 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 06 13:20:08 crc kubenswrapper[4867]: E1006 13:20:08.127275 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8773a802-8117-4ed5-a3fd-a8898833fb11" containerName="dnsmasq-dns" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.127299 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8773a802-8117-4ed5-a3fd-a8898833fb11" containerName="dnsmasq-dns" Oct 06 13:20:08 crc kubenswrapper[4867]: E1006 13:20:08.127308 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8773a802-8117-4ed5-a3fd-a8898833fb11" containerName="init" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.127318 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8773a802-8117-4ed5-a3fd-a8898833fb11" containerName="init" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.127560 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8773a802-8117-4ed5-a3fd-a8898833fb11" containerName="dnsmasq-dns" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.135117 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.137137 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.137183 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.137193 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8773a802-8117-4ed5-a3fd-a8898833fb11-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.137205 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlbhb\" (UniqueName: \"kubernetes.io/projected/8773a802-8117-4ed5-a3fd-a8898833fb11-kube-api-access-nlbhb\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.139956 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lqq9x" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.140200 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.141404 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.143937 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.157111 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.239396 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2npf\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-kube-api-access-c2npf\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.240960 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.241131 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.241298 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dc7edd17-2d19-4949-8849-9a62cd86e861-cache\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.241488 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dc7edd17-2d19-4949-8849-9a62cd86e861-lock\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.278023 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" 
event={"ID":"c204022e-4511-4065-a43e-fa8ef02e2768","Type":"ContainerStarted","Data":"66f86b4f88d8830bb355670e305fdfc2291e0bbe4ab85f5db7e71e45832dd172"} Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.280674 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" event={"ID":"8773a802-8117-4ed5-a3fd-a8898833fb11","Type":"ContainerDied","Data":"94d877e397eb7b7fa2e965efee653fc09d2b117b905210a35e0662d1a64842be"} Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.280715 4867 scope.go:117] "RemoveContainer" containerID="90c2bf0097bf3e243923396bd3d683401d6f091d93dd15a6b5dfba37e59bf64c" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.280882 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bcf488c49-2kmfs" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.285530 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad","Type":"ContainerStarted","Data":"eeb1dd5e2c39448e3997af1dde651e23058e5837a9aaa981c2722ae870ba010d"} Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.285588 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad","Type":"ContainerStarted","Data":"45d8767af0999d7a30a3f7173f6d4931fafa5c76e8bbe540dad2fce16fdf9667"} Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.286358 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.303635 4867 scope.go:117] "RemoveContainer" containerID="51bcb1036ab8b4eb0b712a71b741e70c20e4d99580f6bdb05a5ed6e36f891519" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.317051 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.233609795 
podStartE2EDuration="3.317032389s" podCreationTimestamp="2025-10-06 13:20:05 +0000 UTC" firstStartedPulling="2025-10-06 13:20:06.463423997 +0000 UTC m=+985.921372141" lastFinishedPulling="2025-10-06 13:20:07.546846591 +0000 UTC m=+987.004794735" observedRunningTime="2025-10-06 13:20:08.313584305 +0000 UTC m=+987.771532449" watchObservedRunningTime="2025-10-06 13:20:08.317032389 +0000 UTC m=+987.774980533" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.343337 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dc7edd17-2d19-4949-8849-9a62cd86e861-lock\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.343532 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2npf\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-kube-api-access-c2npf\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.343572 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.343612 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.343665 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/dc7edd17-2d19-4949-8849-9a62cd86e861-cache\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.343899 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/dc7edd17-2d19-4949-8849-9a62cd86e861-lock\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.344246 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/dc7edd17-2d19-4949-8849-9a62cd86e861-cache\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: E1006 13:20:08.344543 4867 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 13:20:08 crc kubenswrapper[4867]: E1006 13:20:08.344569 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 13:20:08 crc kubenswrapper[4867]: E1006 13:20:08.344631 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift podName:dc7edd17-2d19-4949-8849-9a62cd86e861 nodeName:}" failed. No retries permitted until 2025-10-06 13:20:08.844604703 +0000 UTC m=+988.302552947 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift") pod "swift-storage-0" (UID: "dc7edd17-2d19-4949-8849-9a62cd86e861") : configmap "swift-ring-files" not found Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.344963 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.350601 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bcf488c49-2kmfs"] Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.356160 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bcf488c49-2kmfs"] Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.366532 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2npf\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-kube-api-access-c2npf\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.377717 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.697362 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-p9sjp"] Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.709947 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.714662 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.715026 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.715431 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.734101 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-p9sjp"] Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.878866 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.878919 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-dispersionconf\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.878991 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-swiftconf\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.879017 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-combined-ca-bundle\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.879048 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46037a5a-6fcb-48c6-854d-1f4e60534120-etc-swift\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.879085 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-ring-data-devices\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.879105 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jms\" (UniqueName: \"kubernetes.io/projected/46037a5a-6fcb-48c6-854d-1f4e60534120-kube-api-access-d6jms\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.879125 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-scripts\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: E1006 13:20:08.879139 4867 
projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 13:20:08 crc kubenswrapper[4867]: E1006 13:20:08.879174 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 13:20:08 crc kubenswrapper[4867]: E1006 13:20:08.879239 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift podName:dc7edd17-2d19-4949-8849-9a62cd86e861 nodeName:}" failed. No retries permitted until 2025-10-06 13:20:09.879215691 +0000 UTC m=+989.337163825 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift") pod "swift-storage-0" (UID: "dc7edd17-2d19-4949-8849-9a62cd86e861") : configmap "swift-ring-files" not found Oct 06 13:20:08 crc kubenswrapper[4867]: E1006 13:20:08.890557 4867 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.198:54346->38.102.83.198:45409: write tcp 38.102.83.198:54346->38.102.83.198:45409: write: broken pipe Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.981104 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-dispersionconf\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.981198 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-swiftconf\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: 
I1006 13:20:08.981228 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-combined-ca-bundle\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.981273 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46037a5a-6fcb-48c6-854d-1f4e60534120-etc-swift\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.981312 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-ring-data-devices\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.981332 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jms\" (UniqueName: \"kubernetes.io/projected/46037a5a-6fcb-48c6-854d-1f4e60534120-kube-api-access-d6jms\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.981349 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-scripts\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.982168 4867 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-scripts\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.982515 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46037a5a-6fcb-48c6-854d-1f4e60534120-etc-swift\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.982745 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-ring-data-devices\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.986163 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-combined-ca-bundle\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.988729 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-dispersionconf\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.991817 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-swiftconf\") pod 
\"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:08 crc kubenswrapper[4867]: I1006 13:20:08.998297 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jms\" (UniqueName: \"kubernetes.io/projected/46037a5a-6fcb-48c6-854d-1f4e60534120-kube-api-access-d6jms\") pod \"swift-ring-rebalance-p9sjp\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:09 crc kubenswrapper[4867]: I1006 13:20:09.050009 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:09 crc kubenswrapper[4867]: I1006 13:20:09.263284 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8773a802-8117-4ed5-a3fd-a8898833fb11" path="/var/lib/kubelet/pods/8773a802-8117-4ed5-a3fd-a8898833fb11/volumes" Oct 06 13:20:09 crc kubenswrapper[4867]: I1006 13:20:09.301082 4867 generic.go:334] "Generic (PLEG): container finished" podID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerID="c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4" exitCode=0 Oct 06 13:20:09 crc kubenswrapper[4867]: I1006 13:20:09.301181 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd","Type":"ContainerDied","Data":"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4"} Oct 06 13:20:09 crc kubenswrapper[4867]: I1006 13:20:09.304307 4867 generic.go:334] "Generic (PLEG): container finished" podID="c204022e-4511-4065-a43e-fa8ef02e2768" containerID="a4337dfc6f67a65ee0ce8a3f62d04fe3fe8008d640f64cc79966fd49415f6605" exitCode=0 Oct 06 13:20:09 crc kubenswrapper[4867]: I1006 13:20:09.304367 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" 
event={"ID":"c204022e-4511-4065-a43e-fa8ef02e2768","Type":"ContainerDied","Data":"a4337dfc6f67a65ee0ce8a3f62d04fe3fe8008d640f64cc79966fd49415f6605"} Oct 06 13:20:09 crc kubenswrapper[4867]: I1006 13:20:09.597223 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-p9sjp"] Oct 06 13:20:09 crc kubenswrapper[4867]: W1006 13:20:09.599363 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46037a5a_6fcb_48c6_854d_1f4e60534120.slice/crio-a29d503df72f367f1fcdd2182202716242d3e9c4f95bdf45632ca1c91b85929d WatchSource:0}: Error finding container a29d503df72f367f1fcdd2182202716242d3e9c4f95bdf45632ca1c91b85929d: Status 404 returned error can't find the container with id a29d503df72f367f1fcdd2182202716242d3e9c4f95bdf45632ca1c91b85929d Oct 06 13:20:09 crc kubenswrapper[4867]: I1006 13:20:09.904858 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:09 crc kubenswrapper[4867]: E1006 13:20:09.905153 4867 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 13:20:09 crc kubenswrapper[4867]: E1006 13:20:09.905328 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 13:20:09 crc kubenswrapper[4867]: E1006 13:20:09.905435 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift podName:dc7edd17-2d19-4949-8849-9a62cd86e861 nodeName:}" failed. No retries permitted until 2025-10-06 13:20:11.90540635 +0000 UTC m=+991.363354494 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift") pod "swift-storage-0" (UID: "dc7edd17-2d19-4949-8849-9a62cd86e861") : configmap "swift-ring-files" not found Oct 06 13:20:10 crc kubenswrapper[4867]: I1006 13:20:10.323414 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9sjp" event={"ID":"46037a5a-6fcb-48c6-854d-1f4e60534120","Type":"ContainerStarted","Data":"a29d503df72f367f1fcdd2182202716242d3e9c4f95bdf45632ca1c91b85929d"} Oct 06 13:20:10 crc kubenswrapper[4867]: I1006 13:20:10.362758 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" event={"ID":"c204022e-4511-4065-a43e-fa8ef02e2768","Type":"ContainerStarted","Data":"7cf5eecfbca7e8a521c770cf251dd885401a1565a4d8277385605ca5e82978db"} Oct 06 13:20:10 crc kubenswrapper[4867]: I1006 13:20:10.364203 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:10 crc kubenswrapper[4867]: I1006 13:20:10.383444 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" podStartSLOduration=4.38342139 podStartE2EDuration="4.38342139s" podCreationTimestamp="2025-10-06 13:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:20:10.381733204 +0000 UTC m=+989.839681348" watchObservedRunningTime="2025-10-06 13:20:10.38342139 +0000 UTC m=+989.841369534" Oct 06 13:20:11 crc kubenswrapper[4867]: I1006 13:20:11.961080 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:11 crc 
kubenswrapper[4867]: E1006 13:20:11.961385 4867 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 13:20:11 crc kubenswrapper[4867]: E1006 13:20:11.961672 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 13:20:11 crc kubenswrapper[4867]: E1006 13:20:11.961739 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift podName:dc7edd17-2d19-4949-8849-9a62cd86e861 nodeName:}" failed. No retries permitted until 2025-10-06 13:20:15.961718145 +0000 UTC m=+995.419666289 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift") pod "swift-storage-0" (UID: "dc7edd17-2d19-4949-8849-9a62cd86e861") : configmap "swift-ring-files" not found Oct 06 13:20:12 crc kubenswrapper[4867]: I1006 13:20:12.873696 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:20:12 crc kubenswrapper[4867]: I1006 13:20:12.873756 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:20:14 crc kubenswrapper[4867]: I1006 13:20:14.424603 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 06 13:20:14 crc kubenswrapper[4867]: I1006 13:20:14.425139 4867 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 06 13:20:14 crc kubenswrapper[4867]: I1006 13:20:14.492705 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 06 13:20:14 crc kubenswrapper[4867]: I1006 13:20:14.787100 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 06 13:20:14 crc kubenswrapper[4867]: I1006 13:20:14.787688 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 06 13:20:14 crc kubenswrapper[4867]: I1006 13:20:14.868186 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 06 13:20:15 crc kubenswrapper[4867]: I1006 13:20:15.481163 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 06 13:20:15 crc kubenswrapper[4867]: I1006 13:20:15.502338 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 06 13:20:16 crc kubenswrapper[4867]: I1006 13:20:16.050330 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:16 crc kubenswrapper[4867]: E1006 13:20:16.050649 4867 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 13:20:16 crc kubenswrapper[4867]: E1006 13:20:16.050670 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 13:20:16 crc kubenswrapper[4867]: E1006 13:20:16.050726 4867 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift podName:dc7edd17-2d19-4949-8849-9a62cd86e861 nodeName:}" failed. No retries permitted until 2025-10-06 13:20:24.050708099 +0000 UTC m=+1003.508656243 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift") pod "swift-storage-0" (UID: "dc7edd17-2d19-4949-8849-9a62cd86e861") : configmap "swift-ring-files" not found Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.073713 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-8lllr"] Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.076279 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-8lllr" Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.087806 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-8lllr"] Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.173352 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2gvn\" (UniqueName: \"kubernetes.io/projected/cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e-kube-api-access-b2gvn\") pod \"watcher-db-create-8lllr\" (UID: \"cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e\") " pod="openstack/watcher-db-create-8lllr" Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.275042 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gvn\" (UniqueName: \"kubernetes.io/projected/cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e-kube-api-access-b2gvn\") pod \"watcher-db-create-8lllr\" (UID: \"cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e\") " pod="openstack/watcher-db-create-8lllr" Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.303645 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2gvn\" (UniqueName: 
\"kubernetes.io/projected/cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e-kube-api-access-b2gvn\") pod \"watcher-db-create-8lllr\" (UID: \"cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e\") " pod="openstack/watcher-db-create-8lllr" Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.353036 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.393116 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-8lllr" Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.414191 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785fb6df79-8gqlr"] Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.414526 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" podUID="b149ebae-e3d9-4e9a-8d34-511370e9612b" containerName="dnsmasq-dns" containerID="cri-o://f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1" gracePeriod=10 Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.446081 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd","Type":"ContainerStarted","Data":"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d"} Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.460298 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9sjp" event={"ID":"46037a5a-6fcb-48c6-854d-1f4e60534120","Type":"ContainerStarted","Data":"a6b9c79516027d730eb875043a2e5588682b2cfcebc01cae50ef4b6720f82848"} Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.488825 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-p9sjp" podStartSLOduration=2.700208012 podStartE2EDuration="9.488796451s" podCreationTimestamp="2025-10-06 
13:20:08 +0000 UTC" firstStartedPulling="2025-10-06 13:20:09.601975753 +0000 UTC m=+989.059923897" lastFinishedPulling="2025-10-06 13:20:16.390564192 +0000 UTC m=+995.848512336" observedRunningTime="2025-10-06 13:20:17.483900987 +0000 UTC m=+996.941849141" watchObservedRunningTime="2025-10-06 13:20:17.488796451 +0000 UTC m=+996.946744615" Oct 06 13:20:17 crc kubenswrapper[4867]: I1006 13:20:17.926576 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-8lllr"] Oct 06 13:20:17 crc kubenswrapper[4867]: W1006 13:20:17.932066 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcec9595d_7c4e_4fab_b85b_b5c3a70aeb6e.slice/crio-f8cac1605cac715e0da14fa413ee1d6b5e6bc2848c62941bc573bebd7bedf5f3 WatchSource:0}: Error finding container f8cac1605cac715e0da14fa413ee1d6b5e6bc2848c62941bc573bebd7bedf5f3: Status 404 returned error can't find the container with id f8cac1605cac715e0da14fa413ee1d6b5e6bc2848c62941bc573bebd7bedf5f3 Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.136298 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.208110 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-config\") pod \"b149ebae-e3d9-4e9a-8d34-511370e9612b\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.208201 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-dns-svc\") pod \"b149ebae-e3d9-4e9a-8d34-511370e9612b\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.208233 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg6ck\" (UniqueName: \"kubernetes.io/projected/b149ebae-e3d9-4e9a-8d34-511370e9612b-kube-api-access-lg6ck\") pod \"b149ebae-e3d9-4e9a-8d34-511370e9612b\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.208303 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-sb\") pod \"b149ebae-e3d9-4e9a-8d34-511370e9612b\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.208354 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-nb\") pod \"b149ebae-e3d9-4e9a-8d34-511370e9612b\" (UID: \"b149ebae-e3d9-4e9a-8d34-511370e9612b\") " Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.216800 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b149ebae-e3d9-4e9a-8d34-511370e9612b-kube-api-access-lg6ck" (OuterVolumeSpecName: "kube-api-access-lg6ck") pod "b149ebae-e3d9-4e9a-8d34-511370e9612b" (UID: "b149ebae-e3d9-4e9a-8d34-511370e9612b"). InnerVolumeSpecName "kube-api-access-lg6ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.272492 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b149ebae-e3d9-4e9a-8d34-511370e9612b" (UID: "b149ebae-e3d9-4e9a-8d34-511370e9612b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.274205 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b149ebae-e3d9-4e9a-8d34-511370e9612b" (UID: "b149ebae-e3d9-4e9a-8d34-511370e9612b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.275712 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-config" (OuterVolumeSpecName: "config") pod "b149ebae-e3d9-4e9a-8d34-511370e9612b" (UID: "b149ebae-e3d9-4e9a-8d34-511370e9612b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.283064 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b149ebae-e3d9-4e9a-8d34-511370e9612b" (UID: "b149ebae-e3d9-4e9a-8d34-511370e9612b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.311077 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.311117 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.311127 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.311140 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b149ebae-e3d9-4e9a-8d34-511370e9612b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.311151 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg6ck\" (UniqueName: \"kubernetes.io/projected/b149ebae-e3d9-4e9a-8d34-511370e9612b-kube-api-access-lg6ck\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.469042 4867 generic.go:334] "Generic (PLEG): container finished" podID="cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e" containerID="817fbad0acbe08775f673490b1d6553adc454f9a7a6ce3e2ee351d7826bdc7d7" exitCode=0 Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.469126 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-8lllr" event={"ID":"cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e","Type":"ContainerDied","Data":"817fbad0acbe08775f673490b1d6553adc454f9a7a6ce3e2ee351d7826bdc7d7"} Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 
13:20:18.469183 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-8lllr" event={"ID":"cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e","Type":"ContainerStarted","Data":"f8cac1605cac715e0da14fa413ee1d6b5e6bc2848c62941bc573bebd7bedf5f3"} Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.471622 4867 generic.go:334] "Generic (PLEG): container finished" podID="b149ebae-e3d9-4e9a-8d34-511370e9612b" containerID="f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1" exitCode=0 Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.471701 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.471731 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" event={"ID":"b149ebae-e3d9-4e9a-8d34-511370e9612b","Type":"ContainerDied","Data":"f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1"} Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.471754 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" event={"ID":"b149ebae-e3d9-4e9a-8d34-511370e9612b","Type":"ContainerDied","Data":"a800c5210674b3ca7c9322afd1e3f7f13ae5c5d16951d96416d9e7d4ef0cbde2"} Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.471777 4867 scope.go:117] "RemoveContainer" containerID="f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.502102 4867 scope.go:117] "RemoveContainer" containerID="67364fa1d2f7303a62af9bbf289f80b2d7d0fb86643fb1175e978e52aaede0de" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.524747 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785fb6df79-8gqlr"] Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.534906 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-785fb6df79-8gqlr"] Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.545003 4867 scope.go:117] "RemoveContainer" containerID="f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1" Oct 06 13:20:18 crc kubenswrapper[4867]: E1006 13:20:18.545949 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1\": container with ID starting with f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1 not found: ID does not exist" containerID="f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.545985 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1"} err="failed to get container status \"f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1\": rpc error: code = NotFound desc = could not find container \"f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1\": container with ID starting with f7098bb740c419e5655bde2589014e328075a14c47353b8d513e7717d00eeea1 not found: ID does not exist" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.546009 4867 scope.go:117] "RemoveContainer" containerID="67364fa1d2f7303a62af9bbf289f80b2d7d0fb86643fb1175e978e52aaede0de" Oct 06 13:20:18 crc kubenswrapper[4867]: E1006 13:20:18.546440 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67364fa1d2f7303a62af9bbf289f80b2d7d0fb86643fb1175e978e52aaede0de\": container with ID starting with 67364fa1d2f7303a62af9bbf289f80b2d7d0fb86643fb1175e978e52aaede0de not found: ID does not exist" containerID="67364fa1d2f7303a62af9bbf289f80b2d7d0fb86643fb1175e978e52aaede0de" Oct 06 13:20:18 crc kubenswrapper[4867]: I1006 13:20:18.546506 4867 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67364fa1d2f7303a62af9bbf289f80b2d7d0fb86643fb1175e978e52aaede0de"} err="failed to get container status \"67364fa1d2f7303a62af9bbf289f80b2d7d0fb86643fb1175e978e52aaede0de\": rpc error: code = NotFound desc = could not find container \"67364fa1d2f7303a62af9bbf289f80b2d7d0fb86643fb1175e978e52aaede0de\": container with ID starting with 67364fa1d2f7303a62af9bbf289f80b2d7d0fb86643fb1175e978e52aaede0de not found: ID does not exist" Oct 06 13:20:19 crc kubenswrapper[4867]: I1006 13:20:19.232093 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b149ebae-e3d9-4e9a-8d34-511370e9612b" path="/var/lib/kubelet/pods/b149ebae-e3d9-4e9a-8d34-511370e9612b/volumes" Oct 06 13:20:19 crc kubenswrapper[4867]: I1006 13:20:19.484458 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd","Type":"ContainerStarted","Data":"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe"} Oct 06 13:20:19 crc kubenswrapper[4867]: I1006 13:20:19.813947 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-8lllr" Oct 06 13:20:19 crc kubenswrapper[4867]: I1006 13:20:19.953170 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2gvn\" (UniqueName: \"kubernetes.io/projected/cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e-kube-api-access-b2gvn\") pod \"cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e\" (UID: \"cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e\") " Oct 06 13:20:19 crc kubenswrapper[4867]: I1006 13:20:19.958845 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e-kube-api-access-b2gvn" (OuterVolumeSpecName: "kube-api-access-b2gvn") pod "cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e" (UID: "cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e"). InnerVolumeSpecName "kube-api-access-b2gvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:20 crc kubenswrapper[4867]: I1006 13:20:20.056144 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2gvn\" (UniqueName: \"kubernetes.io/projected/cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e-kube-api-access-b2gvn\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:20 crc kubenswrapper[4867]: I1006 13:20:20.499329 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-8lllr" event={"ID":"cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e","Type":"ContainerDied","Data":"f8cac1605cac715e0da14fa413ee1d6b5e6bc2848c62941bc573bebd7bedf5f3"} Oct 06 13:20:20 crc kubenswrapper[4867]: I1006 13:20:20.499375 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8cac1605cac715e0da14fa413ee1d6b5e6bc2848c62941bc573bebd7bedf5f3" Oct 06 13:20:20 crc kubenswrapper[4867]: I1006 13:20:20.499443 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-8lllr" Oct 06 13:20:21 crc kubenswrapper[4867]: I1006 13:20:21.038099 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 06 13:20:22 crc kubenswrapper[4867]: I1006 13:20:22.775867 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785fb6df79-8gqlr" podUID="b149ebae-e3d9-4e9a-8d34-511370e9612b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: i/o timeout" Oct 06 13:20:23 crc kubenswrapper[4867]: I1006 13:20:23.545076 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd","Type":"ContainerStarted","Data":"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389"} Oct 06 13:20:23 crc kubenswrapper[4867]: I1006 13:20:23.574622 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=9.872925774 podStartE2EDuration="47.574597125s" podCreationTimestamp="2025-10-06 13:19:36 +0000 UTC" firstStartedPulling="2025-10-06 13:19:45.096593836 +0000 UTC m=+964.554541980" lastFinishedPulling="2025-10-06 13:20:22.798265187 +0000 UTC m=+1002.256213331" observedRunningTime="2025-10-06 13:20:23.569154006 +0000 UTC m=+1003.027102160" watchObservedRunningTime="2025-10-06 13:20:23.574597125 +0000 UTC m=+1003.032545269" Oct 06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.056623 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:24 crc kubenswrapper[4867]: E1006 13:20:24.056890 4867 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" 
not found Oct 06 13:20:24 crc kubenswrapper[4867]: E1006 13:20:24.056947 4867 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 13:20:24 crc kubenswrapper[4867]: E1006 13:20:24.057052 4867 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift podName:dc7edd17-2d19-4949-8849-9a62cd86e861 nodeName:}" failed. No retries permitted until 2025-10-06 13:20:40.057017606 +0000 UTC m=+1019.514965790 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift") pod "swift-storage-0" (UID: "dc7edd17-2d19-4949-8849-9a62cd86e861") : configmap "swift-ring-files" not found Oct 06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.710205 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2lcbl"] Oct 06 13:20:24 crc kubenswrapper[4867]: E1006 13:20:24.711584 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e" containerName="mariadb-database-create" Oct 06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.711684 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e" containerName="mariadb-database-create" Oct 06 13:20:24 crc kubenswrapper[4867]: E1006 13:20:24.711769 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b149ebae-e3d9-4e9a-8d34-511370e9612b" containerName="init" Oct 06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.711832 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b149ebae-e3d9-4e9a-8d34-511370e9612b" containerName="init" Oct 06 13:20:24 crc kubenswrapper[4867]: E1006 13:20:24.711916 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b149ebae-e3d9-4e9a-8d34-511370e9612b" containerName="dnsmasq-dns" Oct 
06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.711971 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b149ebae-e3d9-4e9a-8d34-511370e9612b" containerName="dnsmasq-dns" Oct 06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.712182 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b149ebae-e3d9-4e9a-8d34-511370e9612b" containerName="dnsmasq-dns" Oct 06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.712266 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e" containerName="mariadb-database-create" Oct 06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.712832 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2lcbl"] Oct 06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.712975 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2lcbl" Oct 06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.773390 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5nqc\" (UniqueName: \"kubernetes.io/projected/dab09423-89f0-4694-a961-9813755dfd88-kube-api-access-q5nqc\") pod \"keystone-db-create-2lcbl\" (UID: \"dab09423-89f0-4694-a961-9813755dfd88\") " pod="openstack/keystone-db-create-2lcbl" Oct 06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.875928 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5nqc\" (UniqueName: \"kubernetes.io/projected/dab09423-89f0-4694-a961-9813755dfd88-kube-api-access-q5nqc\") pod \"keystone-db-create-2lcbl\" (UID: \"dab09423-89f0-4694-a961-9813755dfd88\") " pod="openstack/keystone-db-create-2lcbl" Oct 06 13:20:24 crc kubenswrapper[4867]: I1006 13:20:24.897330 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5nqc\" (UniqueName: 
\"kubernetes.io/projected/dab09423-89f0-4694-a961-9813755dfd88-kube-api-access-q5nqc\") pod \"keystone-db-create-2lcbl\" (UID: \"dab09423-89f0-4694-a961-9813755dfd88\") " pod="openstack/keystone-db-create-2lcbl" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.031171 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-9rd88"] Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.032856 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9rd88" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.037753 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2lcbl" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.045381 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9rd88"] Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.079274 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckhxs\" (UniqueName: \"kubernetes.io/projected/c4791fb2-ae33-4758-824f-4b6b7ae9b4ea-kube-api-access-ckhxs\") pod \"placement-db-create-9rd88\" (UID: \"c4791fb2-ae33-4758-824f-4b6b7ae9b4ea\") " pod="openstack/placement-db-create-9rd88" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.181759 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckhxs\" (UniqueName: \"kubernetes.io/projected/c4791fb2-ae33-4758-824f-4b6b7ae9b4ea-kube-api-access-ckhxs\") pod \"placement-db-create-9rd88\" (UID: \"c4791fb2-ae33-4758-824f-4b6b7ae9b4ea\") " pod="openstack/placement-db-create-9rd88" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.212285 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckhxs\" (UniqueName: \"kubernetes.io/projected/c4791fb2-ae33-4758-824f-4b6b7ae9b4ea-kube-api-access-ckhxs\") pod 
\"placement-db-create-9rd88\" (UID: \"c4791fb2-ae33-4758-824f-4b6b7ae9b4ea\") " pod="openstack/placement-db-create-9rd88" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.277516 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-5dtwn"] Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.278819 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5dtwn" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.287939 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5dtwn"] Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.385599 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nrj7\" (UniqueName: \"kubernetes.io/projected/ad94d2b3-0f12-4bed-82c2-de7289914d0b-kube-api-access-8nrj7\") pod \"glance-db-create-5dtwn\" (UID: \"ad94d2b3-0f12-4bed-82c2-de7289914d0b\") " pod="openstack/glance-db-create-5dtwn" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.424542 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-9rd88" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.487830 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nrj7\" (UniqueName: \"kubernetes.io/projected/ad94d2b3-0f12-4bed-82c2-de7289914d0b-kube-api-access-8nrj7\") pod \"glance-db-create-5dtwn\" (UID: \"ad94d2b3-0f12-4bed-82c2-de7289914d0b\") " pod="openstack/glance-db-create-5dtwn" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.511288 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nrj7\" (UniqueName: \"kubernetes.io/projected/ad94d2b3-0f12-4bed-82c2-de7289914d0b-kube-api-access-8nrj7\") pod \"glance-db-create-5dtwn\" (UID: \"ad94d2b3-0f12-4bed-82c2-de7289914d0b\") " pod="openstack/glance-db-create-5dtwn" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.528506 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2lcbl"] Oct 06 13:20:25 crc kubenswrapper[4867]: W1006 13:20:25.528769 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab09423_89f0_4694_a961_9813755dfd88.slice/crio-38a0ff88839329fc7f759593f5b45a0eeef76d2003ca86c2bd8a4beb1b6724bb WatchSource:0}: Error finding container 38a0ff88839329fc7f759593f5b45a0eeef76d2003ca86c2bd8a4beb1b6724bb: Status 404 returned error can't find the container with id 38a0ff88839329fc7f759593f5b45a0eeef76d2003ca86c2bd8a4beb1b6724bb Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.573951 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2lcbl" event={"ID":"dab09423-89f0-4694-a961-9813755dfd88","Type":"ContainerStarted","Data":"38a0ff88839329fc7f759593f5b45a0eeef76d2003ca86c2bd8a4beb1b6724bb"} Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.576055 4867 generic.go:334] "Generic (PLEG): container finished" 
podID="46037a5a-6fcb-48c6-854d-1f4e60534120" containerID="a6b9c79516027d730eb875043a2e5588682b2cfcebc01cae50ef4b6720f82848" exitCode=0 Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.576086 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9sjp" event={"ID":"46037a5a-6fcb-48c6-854d-1f4e60534120","Type":"ContainerDied","Data":"a6b9c79516027d730eb875043a2e5588682b2cfcebc01cae50ef4b6720f82848"} Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.600314 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5dtwn" Oct 06 13:20:25 crc kubenswrapper[4867]: I1006 13:20:25.874594 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9rd88"] Oct 06 13:20:25 crc kubenswrapper[4867]: W1006 13:20:25.894815 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4791fb2_ae33_4758_824f_4b6b7ae9b4ea.slice/crio-0aba95572f5785bce8bd0f1116528bf9e2302e8cedb5c9f8639ac37a981588b7 WatchSource:0}: Error finding container 0aba95572f5785bce8bd0f1116528bf9e2302e8cedb5c9f8639ac37a981588b7: Status 404 returned error can't find the container with id 0aba95572f5785bce8bd0f1116528bf9e2302e8cedb5c9f8639ac37a981588b7 Oct 06 13:20:26 crc kubenswrapper[4867]: I1006 13:20:26.201975 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5dtwn"] Oct 06 13:20:26 crc kubenswrapper[4867]: I1006 13:20:26.584361 4867 generic.go:334] "Generic (PLEG): container finished" podID="dab09423-89f0-4694-a961-9813755dfd88" containerID="fbacba88f8c57c1d7a8b103de6196b99f03d3007da949ac0527df4f2828b959d" exitCode=0 Oct 06 13:20:26 crc kubenswrapper[4867]: I1006 13:20:26.584611 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2lcbl" 
event={"ID":"dab09423-89f0-4694-a961-9813755dfd88","Type":"ContainerDied","Data":"fbacba88f8c57c1d7a8b103de6196b99f03d3007da949ac0527df4f2828b959d"} Oct 06 13:20:26 crc kubenswrapper[4867]: I1006 13:20:26.586290 4867 generic.go:334] "Generic (PLEG): container finished" podID="c4791fb2-ae33-4758-824f-4b6b7ae9b4ea" containerID="c6f47b4b07a2f6782951b47238e8446a47596715c44e9cb00eca67fadc44ca2c" exitCode=0 Oct 06 13:20:26 crc kubenswrapper[4867]: I1006 13:20:26.586410 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9rd88" event={"ID":"c4791fb2-ae33-4758-824f-4b6b7ae9b4ea","Type":"ContainerDied","Data":"c6f47b4b07a2f6782951b47238e8446a47596715c44e9cb00eca67fadc44ca2c"} Oct 06 13:20:26 crc kubenswrapper[4867]: I1006 13:20:26.586448 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9rd88" event={"ID":"c4791fb2-ae33-4758-824f-4b6b7ae9b4ea","Type":"ContainerStarted","Data":"0aba95572f5785bce8bd0f1116528bf9e2302e8cedb5c9f8639ac37a981588b7"} Oct 06 13:20:26 crc kubenswrapper[4867]: I1006 13:20:26.588061 4867 generic.go:334] "Generic (PLEG): container finished" podID="ad94d2b3-0f12-4bed-82c2-de7289914d0b" containerID="d838f394b26fe381fefd50902d60e1553f7687e92be0da21b7afd485d5d4def6" exitCode=0 Oct 06 13:20:26 crc kubenswrapper[4867]: I1006 13:20:26.588158 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5dtwn" event={"ID":"ad94d2b3-0f12-4bed-82c2-de7289914d0b","Type":"ContainerDied","Data":"d838f394b26fe381fefd50902d60e1553f7687e92be0da21b7afd485d5d4def6"} Oct 06 13:20:26 crc kubenswrapper[4867]: I1006 13:20:26.588179 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5dtwn" event={"ID":"ad94d2b3-0f12-4bed-82c2-de7289914d0b","Type":"ContainerStarted","Data":"f035cfd8717f91fa0688873cdf6b8498e46dc36fc8aaf1cf15b25c89b3eccacf"} Oct 06 13:20:26 crc kubenswrapper[4867]: I1006 13:20:26.884044 4867 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.032465 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-swiftconf\") pod \"46037a5a-6fcb-48c6-854d-1f4e60534120\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.032996 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-ring-data-devices\") pod \"46037a5a-6fcb-48c6-854d-1f4e60534120\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.033053 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-scripts\") pod \"46037a5a-6fcb-48c6-854d-1f4e60534120\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.033132 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-dispersionconf\") pod \"46037a5a-6fcb-48c6-854d-1f4e60534120\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.033265 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6jms\" (UniqueName: \"kubernetes.io/projected/46037a5a-6fcb-48c6-854d-1f4e60534120-kube-api-access-d6jms\") pod \"46037a5a-6fcb-48c6-854d-1f4e60534120\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.033329 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46037a5a-6fcb-48c6-854d-1f4e60534120-etc-swift\") pod \"46037a5a-6fcb-48c6-854d-1f4e60534120\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.033370 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-combined-ca-bundle\") pod \"46037a5a-6fcb-48c6-854d-1f4e60534120\" (UID: \"46037a5a-6fcb-48c6-854d-1f4e60534120\") " Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.034349 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46037a5a-6fcb-48c6-854d-1f4e60534120-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "46037a5a-6fcb-48c6-854d-1f4e60534120" (UID: "46037a5a-6fcb-48c6-854d-1f4e60534120"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.034404 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "46037a5a-6fcb-48c6-854d-1f4e60534120" (UID: "46037a5a-6fcb-48c6-854d-1f4e60534120"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.039343 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46037a5a-6fcb-48c6-854d-1f4e60534120-kube-api-access-d6jms" (OuterVolumeSpecName: "kube-api-access-d6jms") pod "46037a5a-6fcb-48c6-854d-1f4e60534120" (UID: "46037a5a-6fcb-48c6-854d-1f4e60534120"). InnerVolumeSpecName "kube-api-access-d6jms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.042458 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "46037a5a-6fcb-48c6-854d-1f4e60534120" (UID: "46037a5a-6fcb-48c6-854d-1f4e60534120"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.048146 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-e19f-account-create-f8h75"] Oct 06 13:20:27 crc kubenswrapper[4867]: E1006 13:20:27.049068 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46037a5a-6fcb-48c6-854d-1f4e60534120" containerName="swift-ring-rebalance" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.049149 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="46037a5a-6fcb-48c6-854d-1f4e60534120" containerName="swift-ring-rebalance" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.049412 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="46037a5a-6fcb-48c6-854d-1f4e60534120" containerName="swift-ring-rebalance" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.050135 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-e19f-account-create-f8h75" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.053015 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-e19f-account-create-f8h75"] Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.053967 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.075064 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46037a5a-6fcb-48c6-854d-1f4e60534120" (UID: "46037a5a-6fcb-48c6-854d-1f4e60534120"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.080836 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-scripts" (OuterVolumeSpecName: "scripts") pod "46037a5a-6fcb-48c6-854d-1f4e60534120" (UID: "46037a5a-6fcb-48c6-854d-1f4e60534120"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.093218 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "46037a5a-6fcb-48c6-854d-1f4e60534120" (UID: "46037a5a-6fcb-48c6-854d-1f4e60534120"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.135471 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlwg7\" (UniqueName: \"kubernetes.io/projected/19b8e9b5-da69-4679-a7e9-471cfcfa7d92-kube-api-access-dlwg7\") pod \"watcher-e19f-account-create-f8h75\" (UID: \"19b8e9b5-da69-4679-a7e9-471cfcfa7d92\") " pod="openstack/watcher-e19f-account-create-f8h75" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.135919 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.135956 4867 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.135967 4867 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.135976 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46037a5a-6fcb-48c6-854d-1f4e60534120-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.135985 4867 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46037a5a-6fcb-48c6-854d-1f4e60534120-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.135993 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6jms\" (UniqueName: 
\"kubernetes.io/projected/46037a5a-6fcb-48c6-854d-1f4e60534120-kube-api-access-d6jms\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.136003 4867 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46037a5a-6fcb-48c6-854d-1f4e60534120-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.237134 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlwg7\" (UniqueName: \"kubernetes.io/projected/19b8e9b5-da69-4679-a7e9-471cfcfa7d92-kube-api-access-dlwg7\") pod \"watcher-e19f-account-create-f8h75\" (UID: \"19b8e9b5-da69-4679-a7e9-471cfcfa7d92\") " pod="openstack/watcher-e19f-account-create-f8h75" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.254985 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlwg7\" (UniqueName: \"kubernetes.io/projected/19b8e9b5-da69-4679-a7e9-471cfcfa7d92-kube-api-access-dlwg7\") pod \"watcher-e19f-account-create-f8h75\" (UID: \"19b8e9b5-da69-4679-a7e9-471cfcfa7d92\") " pod="openstack/watcher-e19f-account-create-f8h75" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.470184 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-e19f-account-create-f8h75" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.601831 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9sjp" event={"ID":"46037a5a-6fcb-48c6-854d-1f4e60534120","Type":"ContainerDied","Data":"a29d503df72f367f1fcdd2182202716242d3e9c4f95bdf45632ca1c91b85929d"} Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.602213 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a29d503df72f367f1fcdd2182202716242d3e9c4f95bdf45632ca1c91b85929d" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.601928 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p9sjp" Oct 06 13:20:27 crc kubenswrapper[4867]: I1006 13:20:27.944114 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-e19f-account-create-f8h75"] Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.109011 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2lcbl" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.114593 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9rd88" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.150402 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5dtwn" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.162334 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckhxs\" (UniqueName: \"kubernetes.io/projected/c4791fb2-ae33-4758-824f-4b6b7ae9b4ea-kube-api-access-ckhxs\") pod \"c4791fb2-ae33-4758-824f-4b6b7ae9b4ea\" (UID: \"c4791fb2-ae33-4758-824f-4b6b7ae9b4ea\") " Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.162458 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5nqc\" (UniqueName: \"kubernetes.io/projected/dab09423-89f0-4694-a961-9813755dfd88-kube-api-access-q5nqc\") pod \"dab09423-89f0-4694-a961-9813755dfd88\" (UID: \"dab09423-89f0-4694-a961-9813755dfd88\") " Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.184735 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab09423-89f0-4694-a961-9813755dfd88-kube-api-access-q5nqc" (OuterVolumeSpecName: "kube-api-access-q5nqc") pod "dab09423-89f0-4694-a961-9813755dfd88" (UID: "dab09423-89f0-4694-a961-9813755dfd88"). InnerVolumeSpecName "kube-api-access-q5nqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.189688 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4791fb2-ae33-4758-824f-4b6b7ae9b4ea-kube-api-access-ckhxs" (OuterVolumeSpecName: "kube-api-access-ckhxs") pod "c4791fb2-ae33-4758-824f-4b6b7ae9b4ea" (UID: "c4791fb2-ae33-4758-824f-4b6b7ae9b4ea"). InnerVolumeSpecName "kube-api-access-ckhxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.264456 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nrj7\" (UniqueName: \"kubernetes.io/projected/ad94d2b3-0f12-4bed-82c2-de7289914d0b-kube-api-access-8nrj7\") pod \"ad94d2b3-0f12-4bed-82c2-de7289914d0b\" (UID: \"ad94d2b3-0f12-4bed-82c2-de7289914d0b\") " Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.264937 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckhxs\" (UniqueName: \"kubernetes.io/projected/c4791fb2-ae33-4758-824f-4b6b7ae9b4ea-kube-api-access-ckhxs\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.264953 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5nqc\" (UniqueName: \"kubernetes.io/projected/dab09423-89f0-4694-a961-9813755dfd88-kube-api-access-q5nqc\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.267440 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad94d2b3-0f12-4bed-82c2-de7289914d0b-kube-api-access-8nrj7" (OuterVolumeSpecName: "kube-api-access-8nrj7") pod "ad94d2b3-0f12-4bed-82c2-de7289914d0b" (UID: "ad94d2b3-0f12-4bed-82c2-de7289914d0b"). InnerVolumeSpecName "kube-api-access-8nrj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.292713 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.367438 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nrj7\" (UniqueName: \"kubernetes.io/projected/ad94d2b3-0f12-4bed-82c2-de7289914d0b-kube-api-access-8nrj7\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.613631 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5dtwn" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.613667 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5dtwn" event={"ID":"ad94d2b3-0f12-4bed-82c2-de7289914d0b","Type":"ContainerDied","Data":"f035cfd8717f91fa0688873cdf6b8498e46dc36fc8aaf1cf15b25c89b3eccacf"} Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.613746 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f035cfd8717f91fa0688873cdf6b8498e46dc36fc8aaf1cf15b25c89b3eccacf" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.616121 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2lcbl" event={"ID":"dab09423-89f0-4694-a961-9813755dfd88","Type":"ContainerDied","Data":"38a0ff88839329fc7f759593f5b45a0eeef76d2003ca86c2bd8a4beb1b6724bb"} Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.616197 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38a0ff88839329fc7f759593f5b45a0eeef76d2003ca86c2bd8a4beb1b6724bb" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.616574 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2lcbl" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.626655 4867 generic.go:334] "Generic (PLEG): container finished" podID="19b8e9b5-da69-4679-a7e9-471cfcfa7d92" containerID="0ab4e6d31aaf6b5d3dccb9d9267119d8c2e26000f2d36d76f8420ec8d45af974" exitCode=0 Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.626739 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-e19f-account-create-f8h75" event={"ID":"19b8e9b5-da69-4679-a7e9-471cfcfa7d92","Type":"ContainerDied","Data":"0ab4e6d31aaf6b5d3dccb9d9267119d8c2e26000f2d36d76f8420ec8d45af974"} Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.626767 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-e19f-account-create-f8h75" event={"ID":"19b8e9b5-da69-4679-a7e9-471cfcfa7d92","Type":"ContainerStarted","Data":"1ddf103ef3ef1066374229a95467c307ee6d6be250d59cad5ee501a692fd466e"} Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.629068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9rd88" event={"ID":"c4791fb2-ae33-4758-824f-4b6b7ae9b4ea","Type":"ContainerDied","Data":"0aba95572f5785bce8bd0f1116528bf9e2302e8cedb5c9f8639ac37a981588b7"} Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.629132 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aba95572f5785bce8bd0f1116528bf9e2302e8cedb5c9f8639ac37a981588b7" Oct 06 13:20:28 crc kubenswrapper[4867]: I1006 13:20:28.629223 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-9rd88" Oct 06 13:20:29 crc kubenswrapper[4867]: I1006 13:20:29.638985 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f249bfb-ab86-491d-9d1c-b3930fdea27d" containerID="ed05e3e744e02f905cc03b57571a1f3fbc2df72ee0c2ceefc3ed6b911570636c" exitCode=0 Oct 06 13:20:29 crc kubenswrapper[4867]: I1006 13:20:29.639051 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f249bfb-ab86-491d-9d1c-b3930fdea27d","Type":"ContainerDied","Data":"ed05e3e744e02f905cc03b57571a1f3fbc2df72ee0c2ceefc3ed6b911570636c"} Oct 06 13:20:30 crc kubenswrapper[4867]: I1006 13:20:30.022570 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-e19f-account-create-f8h75" Oct 06 13:20:30 crc kubenswrapper[4867]: I1006 13:20:30.097931 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlwg7\" (UniqueName: \"kubernetes.io/projected/19b8e9b5-da69-4679-a7e9-471cfcfa7d92-kube-api-access-dlwg7\") pod \"19b8e9b5-da69-4679-a7e9-471cfcfa7d92\" (UID: \"19b8e9b5-da69-4679-a7e9-471cfcfa7d92\") " Oct 06 13:20:30 crc kubenswrapper[4867]: I1006 13:20:30.104314 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b8e9b5-da69-4679-a7e9-471cfcfa7d92-kube-api-access-dlwg7" (OuterVolumeSpecName: "kube-api-access-dlwg7") pod "19b8e9b5-da69-4679-a7e9-471cfcfa7d92" (UID: "19b8e9b5-da69-4679-a7e9-471cfcfa7d92"). InnerVolumeSpecName "kube-api-access-dlwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:30 crc kubenswrapper[4867]: I1006 13:20:30.200658 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlwg7\" (UniqueName: \"kubernetes.io/projected/19b8e9b5-da69-4679-a7e9-471cfcfa7d92-kube-api-access-dlwg7\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:30 crc kubenswrapper[4867]: I1006 13:20:30.649044 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f249bfb-ab86-491d-9d1c-b3930fdea27d","Type":"ContainerStarted","Data":"af16685efe24a8c80bf1e736c7f8f054bede6974e494d223cd487ec58a9fbfa8"} Oct 06 13:20:30 crc kubenswrapper[4867]: I1006 13:20:30.649874 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:20:30 crc kubenswrapper[4867]: I1006 13:20:30.654767 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-e19f-account-create-f8h75" event={"ID":"19b8e9b5-da69-4679-a7e9-471cfcfa7d92","Type":"ContainerDied","Data":"1ddf103ef3ef1066374229a95467c307ee6d6be250d59cad5ee501a692fd466e"} Oct 06 13:20:30 crc kubenswrapper[4867]: I1006 13:20:30.654810 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ddf103ef3ef1066374229a95467c307ee6d6be250d59cad5ee501a692fd466e" Oct 06 13:20:30 crc kubenswrapper[4867]: I1006 13:20:30.654790 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-e19f-account-create-f8h75" Oct 06 13:20:30 crc kubenswrapper[4867]: I1006 13:20:30.675062 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.062617818 podStartE2EDuration="1m0.675041991s" podCreationTimestamp="2025-10-06 13:19:30 +0000 UTC" firstStartedPulling="2025-10-06 13:19:43.954041675 +0000 UTC m=+963.411989819" lastFinishedPulling="2025-10-06 13:19:53.566465848 +0000 UTC m=+973.024413992" observedRunningTime="2025-10-06 13:20:30.673646943 +0000 UTC m=+1010.131595107" watchObservedRunningTime="2025-10-06 13:20:30.675041991 +0000 UTC m=+1010.132990135" Oct 06 13:20:30 crc kubenswrapper[4867]: I1006 13:20:30.826660 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-tg8j4" podUID="68750dd5-11c8-4fee-853c-09b68df5aff8" containerName="ovn-controller" probeResult="failure" output=< Oct 06 13:20:30 crc kubenswrapper[4867]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 06 13:20:30 crc kubenswrapper[4867]: > Oct 06 13:20:32 crc kubenswrapper[4867]: I1006 13:20:32.674138 4867 generic.go:334] "Generic (PLEG): container finished" podID="acd7f8b9-810f-4e76-b971-c466bf7d4a5b" containerID="20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f" exitCode=0 Oct 06 13:20:32 crc kubenswrapper[4867]: I1006 13:20:32.674178 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"acd7f8b9-810f-4e76-b971-c466bf7d4a5b","Type":"ContainerDied","Data":"20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f"} Oct 06 13:20:33 crc kubenswrapper[4867]: I1006 13:20:33.690125 4867 generic.go:334] "Generic (PLEG): container finished" podID="4beec03b-3d57-4c36-a149-153bb022bd7a" containerID="d9a9433273d2f2fb162c39c0ba1cfae30efc9f88f8074ac3aba59f371dc84a67" exitCode=0 Oct 06 13:20:33 crc kubenswrapper[4867]: I1006 
13:20:33.690188 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"4beec03b-3d57-4c36-a149-153bb022bd7a","Type":"ContainerDied","Data":"d9a9433273d2f2fb162c39c0ba1cfae30efc9f88f8074ac3aba59f371dc84a67"} Oct 06 13:20:33 crc kubenswrapper[4867]: I1006 13:20:33.696439 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"acd7f8b9-810f-4e76-b971-c466bf7d4a5b","Type":"ContainerStarted","Data":"6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68"} Oct 06 13:20:33 crc kubenswrapper[4867]: I1006 13:20:33.696758 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 13:20:33 crc kubenswrapper[4867]: I1006 13:20:33.752705 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=55.200915558 podStartE2EDuration="1m4.752670812s" podCreationTimestamp="2025-10-06 13:19:29 +0000 UTC" firstStartedPulling="2025-10-06 13:19:45.121158617 +0000 UTC m=+964.579106751" lastFinishedPulling="2025-10-06 13:19:54.672913861 +0000 UTC m=+974.130862005" observedRunningTime="2025-10-06 13:20:33.74893351 +0000 UTC m=+1013.206881654" watchObservedRunningTime="2025-10-06 13:20:33.752670812 +0000 UTC m=+1013.210618986" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.712943 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"4beec03b-3d57-4c36-a149-153bb022bd7a","Type":"ContainerStarted","Data":"bd98630efea5c16f35168e15f5d37ea4d79e0805601946e3750d22e001d992a7"} Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.713765 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.752077 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=55.17524686 podStartE2EDuration="1m4.752046478s" podCreationTimestamp="2025-10-06 13:19:30 +0000 UTC" firstStartedPulling="2025-10-06 13:19:45.096210185 +0000 UTC m=+964.554158329" lastFinishedPulling="2025-10-06 13:19:54.673009803 +0000 UTC m=+974.130957947" observedRunningTime="2025-10-06 13:20:34.747295598 +0000 UTC m=+1014.205243752" watchObservedRunningTime="2025-10-06 13:20:34.752046478 +0000 UTC m=+1014.209994622" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.866462 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-868b-account-create-pg9d2"] Oct 06 13:20:34 crc kubenswrapper[4867]: E1006 13:20:34.866979 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab09423-89f0-4694-a961-9813755dfd88" containerName="mariadb-database-create" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.867002 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab09423-89f0-4694-a961-9813755dfd88" containerName="mariadb-database-create" Oct 06 13:20:34 crc kubenswrapper[4867]: E1006 13:20:34.867027 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b8e9b5-da69-4679-a7e9-471cfcfa7d92" containerName="mariadb-account-create" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.867037 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b8e9b5-da69-4679-a7e9-471cfcfa7d92" containerName="mariadb-account-create" Oct 06 13:20:34 crc kubenswrapper[4867]: E1006 13:20:34.867055 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad94d2b3-0f12-4bed-82c2-de7289914d0b" containerName="mariadb-database-create" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.867063 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad94d2b3-0f12-4bed-82c2-de7289914d0b" containerName="mariadb-database-create" Oct 06 13:20:34 crc kubenswrapper[4867]: E1006 13:20:34.867081 4867 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c4791fb2-ae33-4758-824f-4b6b7ae9b4ea" containerName="mariadb-database-create" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.867089 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4791fb2-ae33-4758-824f-4b6b7ae9b4ea" containerName="mariadb-database-create" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.867325 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4791fb2-ae33-4758-824f-4b6b7ae9b4ea" containerName="mariadb-database-create" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.867356 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad94d2b3-0f12-4bed-82c2-de7289914d0b" containerName="mariadb-database-create" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.867387 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab09423-89f0-4694-a961-9813755dfd88" containerName="mariadb-database-create" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.867413 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b8e9b5-da69-4679-a7e9-471cfcfa7d92" containerName="mariadb-account-create" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.868157 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-868b-account-create-pg9d2" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.872205 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.881898 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-868b-account-create-pg9d2"] Oct 06 13:20:34 crc kubenswrapper[4867]: I1006 13:20:34.998360 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7l8w\" (UniqueName: \"kubernetes.io/projected/c62596f3-6c68-47df-9960-c4aa7e5af8fa-kube-api-access-l7l8w\") pod \"keystone-868b-account-create-pg9d2\" (UID: \"c62596f3-6c68-47df-9960-c4aa7e5af8fa\") " pod="openstack/keystone-868b-account-create-pg9d2" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.057303 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a51c-account-create-kfp55"] Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.058391 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a51c-account-create-kfp55" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.060488 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.077437 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a51c-account-create-kfp55"] Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.104789 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7l8w\" (UniqueName: \"kubernetes.io/projected/c62596f3-6c68-47df-9960-c4aa7e5af8fa-kube-api-access-l7l8w\") pod \"keystone-868b-account-create-pg9d2\" (UID: \"c62596f3-6c68-47df-9960-c4aa7e5af8fa\") " pod="openstack/keystone-868b-account-create-pg9d2" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.124198 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7l8w\" (UniqueName: \"kubernetes.io/projected/c62596f3-6c68-47df-9960-c4aa7e5af8fa-kube-api-access-l7l8w\") pod \"keystone-868b-account-create-pg9d2\" (UID: \"c62596f3-6c68-47df-9960-c4aa7e5af8fa\") " pod="openstack/keystone-868b-account-create-pg9d2" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.188793 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-868b-account-create-pg9d2" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.209776 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzsw9\" (UniqueName: \"kubernetes.io/projected/0e6482c5-e172-4c9b-820f-e3b7f81435fa-kube-api-access-gzsw9\") pod \"placement-a51c-account-create-kfp55\" (UID: \"0e6482c5-e172-4c9b-820f-e3b7f81435fa\") " pod="openstack/placement-a51c-account-create-kfp55" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.311525 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzsw9\" (UniqueName: \"kubernetes.io/projected/0e6482c5-e172-4c9b-820f-e3b7f81435fa-kube-api-access-gzsw9\") pod \"placement-a51c-account-create-kfp55\" (UID: \"0e6482c5-e172-4c9b-820f-e3b7f81435fa\") " pod="openstack/placement-a51c-account-create-kfp55" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.337153 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzsw9\" (UniqueName: \"kubernetes.io/projected/0e6482c5-e172-4c9b-820f-e3b7f81435fa-kube-api-access-gzsw9\") pod \"placement-a51c-account-create-kfp55\" (UID: \"0e6482c5-e172-4c9b-820f-e3b7f81435fa\") " pod="openstack/placement-a51c-account-create-kfp55" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.381802 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a51c-account-create-kfp55" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.444947 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-380c-account-create-ppfbr"] Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.446167 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-380c-account-create-ppfbr" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.452887 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-380c-account-create-ppfbr"] Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.458344 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.621265 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9swcw\" (UniqueName: \"kubernetes.io/projected/f8070856-d90e-4aa0-97ca-5d0be29da723-kube-api-access-9swcw\") pod \"glance-380c-account-create-ppfbr\" (UID: \"f8070856-d90e-4aa0-97ca-5d0be29da723\") " pod="openstack/glance-380c-account-create-ppfbr" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.722807 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9swcw\" (UniqueName: \"kubernetes.io/projected/f8070856-d90e-4aa0-97ca-5d0be29da723-kube-api-access-9swcw\") pod \"glance-380c-account-create-ppfbr\" (UID: \"f8070856-d90e-4aa0-97ca-5d0be29da723\") " pod="openstack/glance-380c-account-create-ppfbr" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.741569 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-868b-account-create-pg9d2"] Oct 06 13:20:35 crc kubenswrapper[4867]: W1006 13:20:35.744901 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc62596f3_6c68_47df_9960_c4aa7e5af8fa.slice/crio-871c8c284488e2e49ba613c7be51ee44502a8ee8670a605968585eeecde9bec1 WatchSource:0}: Error finding container 871c8c284488e2e49ba613c7be51ee44502a8ee8670a605968585eeecde9bec1: Status 404 returned error can't find the container with id 871c8c284488e2e49ba613c7be51ee44502a8ee8670a605968585eeecde9bec1 Oct 06 13:20:35 crc kubenswrapper[4867]: 
I1006 13:20:35.755121 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9swcw\" (UniqueName: \"kubernetes.io/projected/f8070856-d90e-4aa0-97ca-5d0be29da723-kube-api-access-9swcw\") pod \"glance-380c-account-create-ppfbr\" (UID: \"f8070856-d90e-4aa0-97ca-5d0be29da723\") " pod="openstack/glance-380c-account-create-ppfbr" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.777728 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-380c-account-create-ppfbr" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.936338 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:20:35 crc kubenswrapper[4867]: I1006 13:20:35.964484 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a51c-account-create-kfp55"] Oct 06 13:20:35 crc kubenswrapper[4867]: W1006 13:20:35.965570 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e6482c5_e172_4c9b_820f_e3b7f81435fa.slice/crio-2bccf96e4c0c1ca0664315c2f13d00beff431dec5163e97e43ae63b1c092711f WatchSource:0}: Error finding container 2bccf96e4c0c1ca0664315c2f13d00beff431dec5163e97e43ae63b1c092711f: Status 404 returned error can't find the container with id 2bccf96e4c0c1ca0664315c2f13d00beff431dec5163e97e43ae63b1c092711f Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.076482 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-tg8j4" podUID="68750dd5-11c8-4fee-853c-09b68df5aff8" containerName="ovn-controller" probeResult="failure" output=< Oct 06 13:20:36 crc kubenswrapper[4867]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 06 13:20:36 crc kubenswrapper[4867]: > Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.218975 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k22cm" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.550677 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-380c-account-create-ppfbr"] Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.575135 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-tg8j4-config-smrwz"] Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.576558 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.579431 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.592891 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tg8j4-config-smrwz"] Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.653197 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-log-ovn\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.653341 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.653416 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt7f4\" (UniqueName: 
\"kubernetes.io/projected/5a5a9728-78a6-4dff-81e9-dc3f2d768220-kube-api-access-gt7f4\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.653521 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-additional-scripts\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.653683 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run-ovn\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.653788 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-scripts\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.731812 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-380c-account-create-ppfbr" event={"ID":"f8070856-d90e-4aa0-97ca-5d0be29da723","Type":"ContainerStarted","Data":"7d8ac331666fc9f13503d3007bed3a3244e17841e8cfe6bbef921c692b82c47e"} Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.733591 4867 generic.go:334] "Generic (PLEG): container finished" podID="0e6482c5-e172-4c9b-820f-e3b7f81435fa" 
containerID="9e589270dd9f918321d1dd58661bb514f87584f94d108fa7a50be940780a76fc" exitCode=0 Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.733654 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a51c-account-create-kfp55" event={"ID":"0e6482c5-e172-4c9b-820f-e3b7f81435fa","Type":"ContainerDied","Data":"9e589270dd9f918321d1dd58661bb514f87584f94d108fa7a50be940780a76fc"} Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.733671 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a51c-account-create-kfp55" event={"ID":"0e6482c5-e172-4c9b-820f-e3b7f81435fa","Type":"ContainerStarted","Data":"2bccf96e4c0c1ca0664315c2f13d00beff431dec5163e97e43ae63b1c092711f"} Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.734914 4867 generic.go:334] "Generic (PLEG): container finished" podID="c62596f3-6c68-47df-9960-c4aa7e5af8fa" containerID="745ca90ca625a445bbf6e09d15c963dd3ec322db3b4e9b48374176ea91fe4375" exitCode=0 Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.734968 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-868b-account-create-pg9d2" event={"ID":"c62596f3-6c68-47df-9960-c4aa7e5af8fa","Type":"ContainerDied","Data":"745ca90ca625a445bbf6e09d15c963dd3ec322db3b4e9b48374176ea91fe4375"} Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.735019 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-868b-account-create-pg9d2" event={"ID":"c62596f3-6c68-47df-9960-c4aa7e5af8fa","Type":"ContainerStarted","Data":"871c8c284488e2e49ba613c7be51ee44502a8ee8670a605968585eeecde9bec1"} Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.755517 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-additional-scripts\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " 
pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.755603 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run-ovn\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.755638 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-scripts\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.755696 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-log-ovn\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.755753 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.755827 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt7f4\" (UniqueName: \"kubernetes.io/projected/5a5a9728-78a6-4dff-81e9-dc3f2d768220-kube-api-access-gt7f4\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " 
pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.756235 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-additional-scripts\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.756476 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.756483 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-log-ovn\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.756526 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run-ovn\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc kubenswrapper[4867]: I1006 13:20:36.758797 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-scripts\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:36 crc 
kubenswrapper[4867]: I1006 13:20:36.789360 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt7f4\" (UniqueName: \"kubernetes.io/projected/5a5a9728-78a6-4dff-81e9-dc3f2d768220-kube-api-access-gt7f4\") pod \"ovn-controller-tg8j4-config-smrwz\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:37 crc kubenswrapper[4867]: I1006 13:20:37.001059 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:37 crc kubenswrapper[4867]: I1006 13:20:37.500784 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tg8j4-config-smrwz"] Oct 06 13:20:37 crc kubenswrapper[4867]: I1006 13:20:37.745060 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tg8j4-config-smrwz" event={"ID":"5a5a9728-78a6-4dff-81e9-dc3f2d768220","Type":"ContainerStarted","Data":"1ba81fd9b5431890021d2b387b6081c1c0dd9235e0ec83d26ccfb4e2a7729803"} Oct 06 13:20:37 crc kubenswrapper[4867]: I1006 13:20:37.750569 4867 generic.go:334] "Generic (PLEG): container finished" podID="f8070856-d90e-4aa0-97ca-5d0be29da723" containerID="41ae2d4db2e143e487e94bc760ad25f9850fe18b989158e1c111b765da334fd5" exitCode=0 Oct 06 13:20:37 crc kubenswrapper[4867]: I1006 13:20:37.750657 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-380c-account-create-ppfbr" event={"ID":"f8070856-d90e-4aa0-97ca-5d0be29da723","Type":"ContainerDied","Data":"41ae2d4db2e143e487e94bc760ad25f9850fe18b989158e1c111b765da334fd5"} Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.159640 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-868b-account-create-pg9d2" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.229292 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a51c-account-create-kfp55" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.292747 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.293937 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7l8w\" (UniqueName: \"kubernetes.io/projected/c62596f3-6c68-47df-9960-c4aa7e5af8fa-kube-api-access-l7l8w\") pod \"c62596f3-6c68-47df-9960-c4aa7e5af8fa\" (UID: \"c62596f3-6c68-47df-9960-c4aa7e5af8fa\") " Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.295264 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.300277 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c62596f3-6c68-47df-9960-c4aa7e5af8fa-kube-api-access-l7l8w" (OuterVolumeSpecName: "kube-api-access-l7l8w") pod "c62596f3-6c68-47df-9960-c4aa7e5af8fa" (UID: "c62596f3-6c68-47df-9960-c4aa7e5af8fa"). InnerVolumeSpecName "kube-api-access-l7l8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.395407 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzsw9\" (UniqueName: \"kubernetes.io/projected/0e6482c5-e172-4c9b-820f-e3b7f81435fa-kube-api-access-gzsw9\") pod \"0e6482c5-e172-4c9b-820f-e3b7f81435fa\" (UID: \"0e6482c5-e172-4c9b-820f-e3b7f81435fa\") " Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.396743 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7l8w\" (UniqueName: \"kubernetes.io/projected/c62596f3-6c68-47df-9960-c4aa7e5af8fa-kube-api-access-l7l8w\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.399075 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e6482c5-e172-4c9b-820f-e3b7f81435fa-kube-api-access-gzsw9" (OuterVolumeSpecName: "kube-api-access-gzsw9") pod "0e6482c5-e172-4c9b-820f-e3b7f81435fa" (UID: "0e6482c5-e172-4c9b-820f-e3b7f81435fa"). InnerVolumeSpecName "kube-api-access-gzsw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.498926 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzsw9\" (UniqueName: \"kubernetes.io/projected/0e6482c5-e172-4c9b-820f-e3b7f81435fa-kube-api-access-gzsw9\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.761662 4867 generic.go:334] "Generic (PLEG): container finished" podID="5a5a9728-78a6-4dff-81e9-dc3f2d768220" containerID="ad47d6a4f7a6e3e30fb468a8e02c5f75a0cb02c3c527a4d939f3c3408897b0af" exitCode=0 Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.761794 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tg8j4-config-smrwz" event={"ID":"5a5a9728-78a6-4dff-81e9-dc3f2d768220","Type":"ContainerDied","Data":"ad47d6a4f7a6e3e30fb468a8e02c5f75a0cb02c3c527a4d939f3c3408897b0af"} Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.764520 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a51c-account-create-kfp55" event={"ID":"0e6482c5-e172-4c9b-820f-e3b7f81435fa","Type":"ContainerDied","Data":"2bccf96e4c0c1ca0664315c2f13d00beff431dec5163e97e43ae63b1c092711f"} Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.764566 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a51c-account-create-kfp55" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.764570 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bccf96e4c0c1ca0664315c2f13d00beff431dec5163e97e43ae63b1c092711f" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.772454 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-868b-account-create-pg9d2" event={"ID":"c62596f3-6c68-47df-9960-c4aa7e5af8fa","Type":"ContainerDied","Data":"871c8c284488e2e49ba613c7be51ee44502a8ee8670a605968585eeecde9bec1"} Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.772490 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="871c8c284488e2e49ba613c7be51ee44502a8ee8670a605968585eeecde9bec1" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.772520 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-868b-account-create-pg9d2" Oct 06 13:20:38 crc kubenswrapper[4867]: I1006 13:20:38.774094 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:39 crc kubenswrapper[4867]: I1006 13:20:39.252656 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-380c-account-create-ppfbr" Oct 06 13:20:39 crc kubenswrapper[4867]: I1006 13:20:39.320524 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9swcw\" (UniqueName: \"kubernetes.io/projected/f8070856-d90e-4aa0-97ca-5d0be29da723-kube-api-access-9swcw\") pod \"f8070856-d90e-4aa0-97ca-5d0be29da723\" (UID: \"f8070856-d90e-4aa0-97ca-5d0be29da723\") " Oct 06 13:20:39 crc kubenswrapper[4867]: I1006 13:20:39.329577 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8070856-d90e-4aa0-97ca-5d0be29da723-kube-api-access-9swcw" (OuterVolumeSpecName: "kube-api-access-9swcw") pod "f8070856-d90e-4aa0-97ca-5d0be29da723" (UID: "f8070856-d90e-4aa0-97ca-5d0be29da723"). InnerVolumeSpecName "kube-api-access-9swcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:39 crc kubenswrapper[4867]: I1006 13:20:39.424018 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9swcw\" (UniqueName: \"kubernetes.io/projected/f8070856-d90e-4aa0-97ca-5d0be29da723-kube-api-access-9swcw\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:39 crc kubenswrapper[4867]: I1006 13:20:39.781628 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-380c-account-create-ppfbr" event={"ID":"f8070856-d90e-4aa0-97ca-5d0be29da723","Type":"ContainerDied","Data":"7d8ac331666fc9f13503d3007bed3a3244e17841e8cfe6bbef921c692b82c47e"} Oct 06 13:20:39 crc kubenswrapper[4867]: I1006 13:20:39.781687 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d8ac331666fc9f13503d3007bed3a3244e17841e8cfe6bbef921c692b82c47e" Oct 06 13:20:39 crc kubenswrapper[4867]: I1006 13:20:39.782771 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-380c-account-create-ppfbr" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.124627 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.136983 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.143308 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc7edd17-2d19-4949-8849-9a62cd86e861-etc-swift\") pod \"swift-storage-0\" (UID: \"dc7edd17-2d19-4949-8849-9a62cd86e861\") " pod="openstack/swift-storage-0" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.239608 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run\") pod \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.239769 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-scripts\") pod \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.239791 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-log-ovn\") pod \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " Oct 
06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.239832 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-additional-scripts\") pod \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.239881 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run-ovn\") pod \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.239922 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt7f4\" (UniqueName: \"kubernetes.io/projected/5a5a9728-78a6-4dff-81e9-dc3f2d768220-kube-api-access-gt7f4\") pod \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\" (UID: \"5a5a9728-78a6-4dff-81e9-dc3f2d768220\") " Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.240239 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5a5a9728-78a6-4dff-81e9-dc3f2d768220" (UID: "5a5a9728-78a6-4dff-81e9-dc3f2d768220"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.240423 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run" (OuterVolumeSpecName: "var-run") pod "5a5a9728-78a6-4dff-81e9-dc3f2d768220" (UID: "5a5a9728-78a6-4dff-81e9-dc3f2d768220"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.240467 4867 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.240955 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5a5a9728-78a6-4dff-81e9-dc3f2d768220" (UID: "5a5a9728-78a6-4dff-81e9-dc3f2d768220"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.241068 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5a5a9728-78a6-4dff-81e9-dc3f2d768220" (UID: "5a5a9728-78a6-4dff-81e9-dc3f2d768220"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.241286 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-scripts" (OuterVolumeSpecName: "scripts") pod "5a5a9728-78a6-4dff-81e9-dc3f2d768220" (UID: "5a5a9728-78a6-4dff-81e9-dc3f2d768220"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.244853 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a5a9728-78a6-4dff-81e9-dc3f2d768220-kube-api-access-gt7f4" (OuterVolumeSpecName: "kube-api-access-gt7f4") pod "5a5a9728-78a6-4dff-81e9-dc3f2d768220" (UID: "5a5a9728-78a6-4dff-81e9-dc3f2d768220"). 
InnerVolumeSpecName "kube-api-access-gt7f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.259092 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.341914 4867 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.342323 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.342334 4867 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5a5a9728-78a6-4dff-81e9-dc3f2d768220-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.342345 4867 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5a5a9728-78a6-4dff-81e9-dc3f2d768220-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.342356 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt7f4\" (UniqueName: \"kubernetes.io/projected/5a5a9728-78a6-4dff-81e9-dc3f2d768220-kube-api-access-gt7f4\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.515173 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-5kc59"] Oct 06 13:20:40 crc kubenswrapper[4867]: E1006 13:20:40.515583 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8070856-d90e-4aa0-97ca-5d0be29da723" containerName="mariadb-account-create" Oct 06 13:20:40 crc 
kubenswrapper[4867]: I1006 13:20:40.515602 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8070856-d90e-4aa0-97ca-5d0be29da723" containerName="mariadb-account-create" Oct 06 13:20:40 crc kubenswrapper[4867]: E1006 13:20:40.515613 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6482c5-e172-4c9b-820f-e3b7f81435fa" containerName="mariadb-account-create" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.515619 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6482c5-e172-4c9b-820f-e3b7f81435fa" containerName="mariadb-account-create" Oct 06 13:20:40 crc kubenswrapper[4867]: E1006 13:20:40.515631 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62596f3-6c68-47df-9960-c4aa7e5af8fa" containerName="mariadb-account-create" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.515637 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62596f3-6c68-47df-9960-c4aa7e5af8fa" containerName="mariadb-account-create" Oct 06 13:20:40 crc kubenswrapper[4867]: E1006 13:20:40.515652 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a5a9728-78a6-4dff-81e9-dc3f2d768220" containerName="ovn-config" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.515658 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a5a9728-78a6-4dff-81e9-dc3f2d768220" containerName="ovn-config" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.515803 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c62596f3-6c68-47df-9960-c4aa7e5af8fa" containerName="mariadb-account-create" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.515814 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6482c5-e172-4c9b-820f-e3b7f81435fa" containerName="mariadb-account-create" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.515834 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a5a9728-78a6-4dff-81e9-dc3f2d768220" 
containerName="ovn-config" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.515844 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8070856-d90e-4aa0-97ca-5d0be29da723" containerName="mariadb-account-create" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.516446 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.521948 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vgsz4" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.523519 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.536390 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-5kc59"] Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.652724 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sm5t\" (UniqueName: \"kubernetes.io/projected/0efadce0-13dd-4a6d-9a54-9deabd1e8069-kube-api-access-2sm5t\") pod \"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.652795 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-config-data\") pod \"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.652889 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-db-sync-config-data\") pod 
\"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.653112 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-combined-ca-bundle\") pod \"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.755176 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sm5t\" (UniqueName: \"kubernetes.io/projected/0efadce0-13dd-4a6d-9a54-9deabd1e8069-kube-api-access-2sm5t\") pod \"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.755244 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-config-data\") pod \"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.755356 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-db-sync-config-data\") pod \"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.755389 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-combined-ca-bundle\") pod \"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " 
pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.760879 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-db-sync-config-data\") pod \"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.760893 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-config-data\") pod \"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.763854 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-combined-ca-bundle\") pod \"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.791793 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sm5t\" (UniqueName: \"kubernetes.io/projected/0efadce0-13dd-4a6d-9a54-9deabd1e8069-kube-api-access-2sm5t\") pod \"glance-db-sync-5kc59\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.797172 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-tg8j4-config-smrwz" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.802193 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tg8j4-config-smrwz" event={"ID":"5a5a9728-78a6-4dff-81e9-dc3f2d768220","Type":"ContainerDied","Data":"1ba81fd9b5431890021d2b387b6081c1c0dd9235e0ec83d26ccfb4e2a7729803"} Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.802777 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ba81fd9b5431890021d2b387b6081c1c0dd9235e0ec83d26ccfb4e2a7729803" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.838937 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5kc59" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.894566 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-tg8j4" Oct 06 13:20:40 crc kubenswrapper[4867]: I1006 13:20:40.937466 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 06 13:20:40 crc kubenswrapper[4867]: W1006 13:20:40.949138 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc7edd17_2d19_4949_8849_9a62cd86e861.slice/crio-abccee43f649252c284a3e83585cbd31af506dfc488707e7415081a7ebed62ad WatchSource:0}: Error finding container abccee43f649252c284a3e83585cbd31af506dfc488707e7415081a7ebed62ad: Status 404 returned error can't find the container with id abccee43f649252c284a3e83585cbd31af506dfc488707e7415081a7ebed62ad Oct 06 13:20:41 crc kubenswrapper[4867]: I1006 13:20:41.256400 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-tg8j4-config-smrwz"] Oct 06 13:20:41 crc kubenswrapper[4867]: I1006 13:20:41.268334 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-tg8j4-config-smrwz"] Oct 06 
13:20:41 crc kubenswrapper[4867]: I1006 13:20:41.463158 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-5kc59"] Oct 06 13:20:41 crc kubenswrapper[4867]: I1006 13:20:41.558084 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="9f249bfb-ab86-491d-9d1c-b3930fdea27d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Oct 06 13:20:41 crc kubenswrapper[4867]: I1006 13:20:41.806676 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"abccee43f649252c284a3e83585cbd31af506dfc488707e7415081a7ebed62ad"} Oct 06 13:20:41 crc kubenswrapper[4867]: I1006 13:20:41.808478 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5kc59" event={"ID":"0efadce0-13dd-4a6d-9a54-9deabd1e8069","Type":"ContainerStarted","Data":"d689693337e4946b42114cdb55f0dc0fb02689a720e8692afb40a2667cde3046"} Oct 06 13:20:41 crc kubenswrapper[4867]: I1006 13:20:41.847354 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:20:41 crc kubenswrapper[4867]: I1006 13:20:41.847701 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="prometheus" containerID="cri-o://74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d" gracePeriod=600 Oct 06 13:20:41 crc kubenswrapper[4867]: I1006 13:20:41.847809 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="config-reloader" containerID="cri-o://0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe" gracePeriod=600 Oct 06 13:20:41 crc 
kubenswrapper[4867]: I1006 13:20:41.847821 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="thanos-sidecar" containerID="cri-o://49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389" gracePeriod=600 Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.641212 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.809734 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config\") pod \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.809829 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-web-config\") pod \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.809974 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.809999 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-tls-assets\") pod \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 
13:20:42.810179 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-prometheus-metric-storage-rulefiles-0\") pod \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.810208 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config-out\") pod \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.810300 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69cpj\" (UniqueName: \"kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-kube-api-access-69cpj\") pod \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.810329 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-thanos-prometheus-http-client-file\") pod \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\" (UID: \"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd\") " Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.818778 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config" (OuterVolumeSpecName: "config") pod "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" (UID: "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.820396 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" (UID: "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.821761 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" (UID: "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.828386 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config-out" (OuterVolumeSpecName: "config-out") pod "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" (UID: "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.828953 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" (UID: "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.831073 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-kube-api-access-69cpj" (OuterVolumeSpecName: "kube-api-access-69cpj") pod "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" (UID: "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd"). InnerVolumeSpecName "kube-api-access-69cpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.837520 4867 generic.go:334] "Generic (PLEG): container finished" podID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerID="49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389" exitCode=0 Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.837575 4867 generic.go:334] "Generic (PLEG): container finished" podID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerID="0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe" exitCode=0 Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.837583 4867 generic.go:334] "Generic (PLEG): container finished" podID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerID="74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d" exitCode=0 Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.837744 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.837766 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd","Type":"ContainerDied","Data":"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389"} Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.837805 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd","Type":"ContainerDied","Data":"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe"} Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.837818 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd","Type":"ContainerDied","Data":"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d"} Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.837830 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd","Type":"ContainerDied","Data":"72bd4e9b36729f7985e821cedca4e8b59ddfd2a1408f9cd6ff440831de573b83"} Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.837855 4867 scope.go:117] "RemoveContainer" containerID="49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.849060 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"7e3eeeb6d44166bc477cf22c33fa985e11bac4c31054691c03236859e3a2b944"} Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.849125 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"80a433a7dd5bc5f864ef9911a44f9749206417d21eb94d2e1e545ee10e9d2055"} Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.850468 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" (UID: "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd"). InnerVolumeSpecName "pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.857581 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-web-config" (OuterVolumeSpecName: "web-config") pod "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" (UID: "2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.866200 4867 scope.go:117] "RemoveContainer" containerID="0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.874191 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.874537 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.909463 4867 scope.go:117] "RemoveContainer" containerID="74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.915114 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.915156 4867 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-web-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.915218 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") on node \"crc\" " Oct 06 13:20:42 
crc kubenswrapper[4867]: I1006 13:20:42.915239 4867 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.915307 4867 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.915327 4867 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-config-out\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.915341 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69cpj\" (UniqueName: \"kubernetes.io/projected/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-kube-api-access-69cpj\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.915354 4867 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.939055 4867 scope.go:117] "RemoveContainer" containerID="c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.963132 4867 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.963675 4867 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4") on node "crc" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.968376 4867 scope.go:117] "RemoveContainer" containerID="49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389" Oct 06 13:20:42 crc kubenswrapper[4867]: E1006 13:20:42.969804 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389\": container with ID starting with 49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389 not found: ID does not exist" containerID="49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.969845 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389"} err="failed to get container status \"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389\": rpc error: code = NotFound desc = could not find container \"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389\": container with ID starting with 49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389 not found: ID does not exist" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.969875 4867 scope.go:117] "RemoveContainer" containerID="0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe" Oct 06 13:20:42 crc kubenswrapper[4867]: E1006 13:20:42.970763 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe\": container with ID starting 
with 0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe not found: ID does not exist" containerID="0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.970822 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe"} err="failed to get container status \"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe\": rpc error: code = NotFound desc = could not find container \"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe\": container with ID starting with 0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe not found: ID does not exist" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.970867 4867 scope.go:117] "RemoveContainer" containerID="74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d" Oct 06 13:20:42 crc kubenswrapper[4867]: E1006 13:20:42.971397 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d\": container with ID starting with 74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d not found: ID does not exist" containerID="74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.971432 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d"} err="failed to get container status \"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d\": rpc error: code = NotFound desc = could not find container \"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d\": container with ID starting with 74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d not found: ID does 
not exist" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.971452 4867 scope.go:117] "RemoveContainer" containerID="c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4" Oct 06 13:20:42 crc kubenswrapper[4867]: E1006 13:20:42.972112 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4\": container with ID starting with c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4 not found: ID does not exist" containerID="c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.972140 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4"} err="failed to get container status \"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4\": rpc error: code = NotFound desc = could not find container \"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4\": container with ID starting with c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4 not found: ID does not exist" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.972156 4867 scope.go:117] "RemoveContainer" containerID="49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.972580 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389"} err="failed to get container status \"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389\": rpc error: code = NotFound desc = could not find container \"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389\": container with ID starting with 49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389 not 
found: ID does not exist" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.972601 4867 scope.go:117] "RemoveContainer" containerID="0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.973051 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe"} err="failed to get container status \"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe\": rpc error: code = NotFound desc = could not find container \"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe\": container with ID starting with 0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe not found: ID does not exist" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.973096 4867 scope.go:117] "RemoveContainer" containerID="74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.973603 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d"} err="failed to get container status \"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d\": rpc error: code = NotFound desc = could not find container \"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d\": container with ID starting with 74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d not found: ID does not exist" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.973628 4867 scope.go:117] "RemoveContainer" containerID="c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.974158 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4"} err="failed to get 
container status \"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4\": rpc error: code = NotFound desc = could not find container \"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4\": container with ID starting with c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4 not found: ID does not exist" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.974181 4867 scope.go:117] "RemoveContainer" containerID="49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.974428 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389"} err="failed to get container status \"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389\": rpc error: code = NotFound desc = could not find container \"49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389\": container with ID starting with 49f7a114a95779ba44e2a6e75775794ca9058510ed00b3fa143e81b9242de389 not found: ID does not exist" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.974464 4867 scope.go:117] "RemoveContainer" containerID="0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.974720 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe"} err="failed to get container status \"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe\": rpc error: code = NotFound desc = could not find container \"0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe\": container with ID starting with 0337a431e930d96a297b2c5120ad777d65d4e6a7fda3f8dd0ae8c10ae2411bfe not found: ID does not exist" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.974741 4867 scope.go:117] "RemoveContainer" 
containerID="74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.974978 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d"} err="failed to get container status \"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d\": rpc error: code = NotFound desc = could not find container \"74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d\": container with ID starting with 74d7533fa3405481f2048251217b7d17c1dffea262eee5409f7e3d9ba799b74d not found: ID does not exist" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.975011 4867 scope.go:117] "RemoveContainer" containerID="c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4" Oct 06 13:20:42 crc kubenswrapper[4867]: I1006 13:20:42.975213 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4"} err="failed to get container status \"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4\": rpc error: code = NotFound desc = could not find container \"c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4\": container with ID starting with c58649256750ab34eeef3ba813a908ddb22338880cfecaa4bb38de0d014f82a4 not found: ID does not exist" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.017211 4867 reconciler_common.go:293] "Volume detached for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") on node \"crc\" DevicePath \"\"" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.189303 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.199673 4867 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.288197 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" path="/var/lib/kubelet/pods/2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd/volumes" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.289031 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a5a9728-78a6-4dff-81e9-dc3f2d768220" path="/var/lib/kubelet/pods/5a5a9728-78a6-4dff-81e9-dc3f2d768220/volumes" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.290875 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:20:43 crc kubenswrapper[4867]: E1006 13:20:43.291213 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="init-config-reloader" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.291226 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="init-config-reloader" Oct 06 13:20:43 crc kubenswrapper[4867]: E1006 13:20:43.291266 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="config-reloader" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.291272 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="config-reloader" Oct 06 13:20:43 crc kubenswrapper[4867]: E1006 13:20:43.291290 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="prometheus" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.291295 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="prometheus" Oct 06 13:20:43 crc kubenswrapper[4867]: E1006 
13:20:43.291316 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="thanos-sidecar" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.291322 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="thanos-sidecar" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.291659 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="thanos-sidecar" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.291670 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="config-reloader" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.291683 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2adfb1f6-0eba-4f81-a44b-0ef26a3b80fd" containerName="prometheus" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.302007 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.302140 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.309843 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.310376 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.310628 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.311094 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.312764 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.313139 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.313206 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-hkmxs" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.425417 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.425489 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.425520 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7g6t\" (UniqueName: \"kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-kube-api-access-g7g6t\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.425557 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.425585 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-config\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.425696 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc 
kubenswrapper[4867]: I1006 13:20:43.425721 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.425767 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/142692d3-42d3-469a-ab1e-e24752dd0b11-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.425805 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.425835 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.425875 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/142692d3-42d3-469a-ab1e-e24752dd0b11-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.527722 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.527835 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.527896 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/142692d3-42d3-469a-ab1e-e24752dd0b11-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.527950 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.527998 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.528034 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7g6t\" (UniqueName: \"kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-kube-api-access-g7g6t\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.528098 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.528135 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-config\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.528171 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.528198 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.528238 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/142692d3-42d3-469a-ab1e-e24752dd0b11-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.531677 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/142692d3-42d3-469a-ab1e-e24752dd0b11-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.534242 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/142692d3-42d3-469a-ab1e-e24752dd0b11-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.535422 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " 
pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.535612 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-config\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.535728 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.536269 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.536277 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.538571 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.543881 4867 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.543917 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8b1bb68adf8576a50f9d1afe1558762f141c90adcfe42ae323643ac07b58f5a8/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.549206 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.551112 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7g6t\" (UniqueName: \"kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-kube-api-access-g7g6t\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.598590 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"prometheus-metric-storage-0\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.629624 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.870845 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"dc7c3d6e4a56b47083279ee6ae2284ad88314e1b4502131415d2cb8bf780f5dd"} Oct 06 13:20:43 crc kubenswrapper[4867]: I1006 13:20:43.871327 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"e94df09f57911f2d06f3d166d90e150cf366e897299190fa25758ebde902aca4"} Oct 06 13:20:44 crc kubenswrapper[4867]: I1006 13:20:44.208870 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:20:44 crc kubenswrapper[4867]: I1006 13:20:44.885603 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"142692d3-42d3-469a-ab1e-e24752dd0b11","Type":"ContainerStarted","Data":"f3578a41a1be63f4d3044a21d8333b539c020db3d7bc82567588c1d561871946"} Oct 06 13:20:44 crc kubenswrapper[4867]: I1006 13:20:44.890587 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"b2d53371e52051016b71910faa92136e3fa100ab0d18356461b19c2c1145411c"} Oct 06 13:20:45 crc kubenswrapper[4867]: I1006 13:20:45.905692 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"8850ad4b63d92694e098c802bb2e79e0e4dbaeffeb3049e98b06ba5da071e778"} Oct 06 13:20:45 crc kubenswrapper[4867]: I1006 13:20:45.906198 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"f779ab62a30e4b92a7d8d5daa21ac0b4b53dd18efd9753f226df84c803bd1ff6"} Oct 06 13:20:45 crc kubenswrapper[4867]: I1006 13:20:45.906212 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"b29b7e0abe1ce1f6e5dc53ceb7a5f7a782389c5072786399f238730304cef5b2"} Oct 06 13:20:46 crc kubenswrapper[4867]: I1006 13:20:46.923934 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"b26d26a70f9838e7c4d6ae5201211c9d4dee48ece889268ba5ad7ac0d6a144f7"} Oct 06 13:20:47 crc kubenswrapper[4867]: I1006 13:20:47.942308 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"142692d3-42d3-469a-ab1e-e24752dd0b11","Type":"ContainerStarted","Data":"75d0acf68d2e15dcf7d71ede8da663ced249c84d0fce069f660f6cbdd38884f6"} Oct 06 13:20:47 crc kubenswrapper[4867]: I1006 13:20:47.953103 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"2fb26f1fa439cb167718806204f69a9df9ea270c61531ff224b8780cadf99806"} Oct 06 13:20:47 crc kubenswrapper[4867]: I1006 13:20:47.953150 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"bfbc9307e7a388bc4962e5b71942422470b74a7ede7b0fa3c1fad6f9e74160b1"} Oct 06 13:20:47 crc 
kubenswrapper[4867]: I1006 13:20:47.953162 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"e47e42efd981d170917e97fdcca0b37ffb5f567425aca1792a791c9ccd436e1a"} Oct 06 13:20:47 crc kubenswrapper[4867]: I1006 13:20:47.953171 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"b3a0222d72336a4cdbeb5fcd726de840a79e727dc7e6af19876ed29f44ee7ae5"} Oct 06 13:20:48 crc kubenswrapper[4867]: I1006 13:20:48.976957 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"18205e1c48bcd3a836272db1121da6ee6bde8226c2a52082e8091b5e7e265fe4"} Oct 06 13:20:48 crc kubenswrapper[4867]: I1006 13:20:48.977007 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"dc7edd17-2d19-4949-8849-9a62cd86e861","Type":"ContainerStarted","Data":"c98d052151b1b3ca16261b2423107d89249fb4dc11a3f88ed73f9a161e41699d"} Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.018114 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.65397471 podStartE2EDuration="42.018091811s" podCreationTimestamp="2025-10-06 13:20:07 +0000 UTC" firstStartedPulling="2025-10-06 13:20:40.962971461 +0000 UTC m=+1020.420919605" lastFinishedPulling="2025-10-06 13:20:46.327088562 +0000 UTC m=+1025.785036706" observedRunningTime="2025-10-06 13:20:49.014454001 +0000 UTC m=+1028.472402135" watchObservedRunningTime="2025-10-06 13:20:49.018091811 +0000 UTC m=+1028.476039955" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.351263 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9fbf94969-mfmxl"] Oct 06 13:20:49 crc 
kubenswrapper[4867]: I1006 13:20:49.353687 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.360344 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.371856 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9fbf94969-mfmxl"] Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.457182 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-nb\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.457293 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-296t8\" (UniqueName: \"kubernetes.io/projected/267fe65a-faaf-40f1-9a41-d44776aa6b53-kube-api-access-296t8\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.457345 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-swift-storage-0\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.457366 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-sb\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.457416 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-svc\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.457438 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-config\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.559776 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-296t8\" (UniqueName: \"kubernetes.io/projected/267fe65a-faaf-40f1-9a41-d44776aa6b53-kube-api-access-296t8\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.559874 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-swift-storage-0\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.559902 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-sb\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.559929 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-svc\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.559954 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-config\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.559999 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-nb\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.561878 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-config\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.562032 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-svc\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: 
\"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.562302 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-sb\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.564180 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-swift-storage-0\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.564674 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-nb\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.587588 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-296t8\" (UniqueName: \"kubernetes.io/projected/267fe65a-faaf-40f1-9a41-d44776aa6b53-kube-api-access-296t8\") pod \"dnsmasq-dns-9fbf94969-mfmxl\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:49 crc kubenswrapper[4867]: I1006 13:20:49.677213 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:20:50 crc kubenswrapper[4867]: I1006 13:20:50.162818 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9fbf94969-mfmxl"] Oct 06 13:20:51 crc kubenswrapper[4867]: I1006 13:20:51.211501 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="acd7f8b9-810f-4e76-b971-c466bf7d4a5b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Oct 06 13:20:51 crc kubenswrapper[4867]: I1006 13:20:51.558463 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:20:51 crc kubenswrapper[4867]: I1006 13:20:51.919027 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="4beec03b-3d57-4c36-a149-153bb022bd7a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused" Oct 06 13:20:54 crc kubenswrapper[4867]: I1006 13:20:54.026544 4867 generic.go:334] "Generic (PLEG): container finished" podID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerID="75d0acf68d2e15dcf7d71ede8da663ced249c84d0fce069f660f6cbdd38884f6" exitCode=0 Oct 06 13:20:54 crc kubenswrapper[4867]: I1006 13:20:54.026718 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"142692d3-42d3-469a-ab1e-e24752dd0b11","Type":"ContainerDied","Data":"75d0acf68d2e15dcf7d71ede8da663ced249c84d0fce069f660f6cbdd38884f6"} Oct 06 13:20:58 crc kubenswrapper[4867]: W1006 13:20:58.337634 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod267fe65a_faaf_40f1_9a41_d44776aa6b53.slice/crio-403a12e88b9bf4fba69f34e224d9b6b92c8044a59932060ce54a7a1b78a3b0f0 WatchSource:0}: Error finding container 
403a12e88b9bf4fba69f34e224d9b6b92c8044a59932060ce54a7a1b78a3b0f0: Status 404 returned error can't find the container with id 403a12e88b9bf4fba69f34e224d9b6b92c8044a59932060ce54a7a1b78a3b0f0 Oct 06 13:20:59 crc kubenswrapper[4867]: I1006 13:20:59.095574 4867 generic.go:334] "Generic (PLEG): container finished" podID="267fe65a-faaf-40f1-9a41-d44776aa6b53" containerID="30a337ee23fafb35887833b3942c84a4940ddf9d4206f2139fdb778c8e93ee4c" exitCode=0 Oct 06 13:20:59 crc kubenswrapper[4867]: I1006 13:20:59.095755 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" event={"ID":"267fe65a-faaf-40f1-9a41-d44776aa6b53","Type":"ContainerDied","Data":"30a337ee23fafb35887833b3942c84a4940ddf9d4206f2139fdb778c8e93ee4c"} Oct 06 13:20:59 crc kubenswrapper[4867]: I1006 13:20:59.096348 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" event={"ID":"267fe65a-faaf-40f1-9a41-d44776aa6b53","Type":"ContainerStarted","Data":"403a12e88b9bf4fba69f34e224d9b6b92c8044a59932060ce54a7a1b78a3b0f0"} Oct 06 13:20:59 crc kubenswrapper[4867]: I1006 13:20:59.097891 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5kc59" event={"ID":"0efadce0-13dd-4a6d-9a54-9deabd1e8069","Type":"ContainerStarted","Data":"2cd606c92663700319d70ddfe520d03e0d4c8353be64b821591a9e77718e5d30"} Oct 06 13:20:59 crc kubenswrapper[4867]: I1006 13:20:59.100771 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"142692d3-42d3-469a-ab1e-e24752dd0b11","Type":"ContainerStarted","Data":"9f31e63f5b82c4a8df364cafb166381b10241cc8cbe912fe523b493ae6e21951"} Oct 06 13:20:59 crc kubenswrapper[4867]: I1006 13:20:59.142792 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-5kc59" podStartSLOduration=2.216300132 podStartE2EDuration="19.142750278s" podCreationTimestamp="2025-10-06 13:20:40 +0000 UTC" 
firstStartedPulling="2025-10-06 13:20:41.479065743 +0000 UTC m=+1020.937013887" lastFinishedPulling="2025-10-06 13:20:58.405515879 +0000 UTC m=+1037.863464033" observedRunningTime="2025-10-06 13:20:59.140907058 +0000 UTC m=+1038.598855202" watchObservedRunningTime="2025-10-06 13:20:59.142750278 +0000 UTC m=+1038.600698422" Oct 06 13:21:00 crc kubenswrapper[4867]: I1006 13:21:00.113238 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" event={"ID":"267fe65a-faaf-40f1-9a41-d44776aa6b53","Type":"ContainerStarted","Data":"6b42b29e4d38d08e45440802df8a1e808c08264f146b225e8f1af53eb6d15221"} Oct 06 13:21:00 crc kubenswrapper[4867]: I1006 13:21:00.113313 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:21:00 crc kubenswrapper[4867]: I1006 13:21:00.139524 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" podStartSLOduration=11.139493392 podStartE2EDuration="11.139493392s" podCreationTimestamp="2025-10-06 13:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:21:00.132286285 +0000 UTC m=+1039.590234429" watchObservedRunningTime="2025-10-06 13:21:00.139493392 +0000 UTC m=+1039.597441536" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.127888 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"142692d3-42d3-469a-ab1e-e24752dd0b11","Type":"ContainerStarted","Data":"b53990b0029bf080c6eba2b5ad95272a836e4fb7e74db4d7ead2f1f083d0de16"} Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.210955 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.510194 4867 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-db-create-7nb9q"] Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.512213 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7nb9q" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.522835 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7nb9q"] Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.595174 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpkrp\" (UniqueName: \"kubernetes.io/projected/f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418-kube-api-access-lpkrp\") pod \"barbican-db-create-7nb9q\" (UID: \"f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418\") " pod="openstack/barbican-db-create-7nb9q" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.685337 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-895bn"] Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.686605 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-895bn" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.696613 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-895bn"] Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.699902 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpkrp\" (UniqueName: \"kubernetes.io/projected/f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418-kube-api-access-lpkrp\") pod \"barbican-db-create-7nb9q\" (UID: \"f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418\") " pod="openstack/barbican-db-create-7nb9q" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.758759 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpkrp\" (UniqueName: \"kubernetes.io/projected/f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418-kube-api-access-lpkrp\") pod \"barbican-db-create-7nb9q\" (UID: \"f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418\") " pod="openstack/barbican-db-create-7nb9q" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.780920 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-nzl2m"] Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.783572 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-nzl2m" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.795670 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nzl2m"] Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.801270 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnzg8\" (UniqueName: \"kubernetes.io/projected/32ed8ecc-126e-40f7-b923-5e26dacacb06-kube-api-access-lnzg8\") pod \"cinder-db-create-895bn\" (UID: \"32ed8ecc-126e-40f7-b923-5e26dacacb06\") " pod="openstack/cinder-db-create-895bn" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.845067 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-9rnv4"] Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.846391 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.847515 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7nb9q" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.853986 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.854244 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.854338 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.854267 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sjvr2" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.860948 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9rnv4"] Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.903696 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t86tp\" (UniqueName: \"kubernetes.io/projected/876182a1-a170-4101-b8ce-041060a74555-kube-api-access-t86tp\") pod \"neutron-db-create-nzl2m\" (UID: \"876182a1-a170-4101-b8ce-041060a74555\") " pod="openstack/neutron-db-create-nzl2m" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.903784 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnzg8\" (UniqueName: \"kubernetes.io/projected/32ed8ecc-126e-40f7-b923-5e26dacacb06-kube-api-access-lnzg8\") pod \"cinder-db-create-895bn\" (UID: \"32ed8ecc-126e-40f7-b923-5e26dacacb06\") " pod="openstack/cinder-db-create-895bn" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.920560 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Oct 06 13:21:01 crc kubenswrapper[4867]: I1006 13:21:01.923104 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-lnzg8\" (UniqueName: \"kubernetes.io/projected/32ed8ecc-126e-40f7-b923-5e26dacacb06-kube-api-access-lnzg8\") pod \"cinder-db-create-895bn\" (UID: \"32ed8ecc-126e-40f7-b923-5e26dacacb06\") " pod="openstack/cinder-db-create-895bn" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.005223 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-combined-ca-bundle\") pod \"keystone-db-sync-9rnv4\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.005808 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-config-data\") pod \"keystone-db-sync-9rnv4\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.005838 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dw7v\" (UniqueName: \"kubernetes.io/projected/3c55d65d-f40c-402b-895b-fdf4000fdf33-kube-api-access-2dw7v\") pod \"keystone-db-sync-9rnv4\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.005981 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t86tp\" (UniqueName: \"kubernetes.io/projected/876182a1-a170-4101-b8ce-041060a74555-kube-api-access-t86tp\") pod \"neutron-db-create-nzl2m\" (UID: \"876182a1-a170-4101-b8ce-041060a74555\") " pod="openstack/neutron-db-create-nzl2m" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.008829 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-895bn" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.030276 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t86tp\" (UniqueName: \"kubernetes.io/projected/876182a1-a170-4101-b8ce-041060a74555-kube-api-access-t86tp\") pod \"neutron-db-create-nzl2m\" (UID: \"876182a1-a170-4101-b8ce-041060a74555\") " pod="openstack/neutron-db-create-nzl2m" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.108268 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-config-data\") pod \"keystone-db-sync-9rnv4\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.108319 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dw7v\" (UniqueName: \"kubernetes.io/projected/3c55d65d-f40c-402b-895b-fdf4000fdf33-kube-api-access-2dw7v\") pod \"keystone-db-sync-9rnv4\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.108440 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-combined-ca-bundle\") pod \"keystone-db-sync-9rnv4\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.109607 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-nzl2m" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.112071 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-config-data\") pod \"keystone-db-sync-9rnv4\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.112909 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-combined-ca-bundle\") pod \"keystone-db-sync-9rnv4\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.131143 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dw7v\" (UniqueName: \"kubernetes.io/projected/3c55d65d-f40c-402b-895b-fdf4000fdf33-kube-api-access-2dw7v\") pod \"keystone-db-sync-9rnv4\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.150352 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"142692d3-42d3-469a-ab1e-e24752dd0b11","Type":"ContainerStarted","Data":"064974d3fb6315a201c9f6a24226589e556c74f259c14d0eb8d3c898db9995d7"} Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.176832 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.194952 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.194934514 podStartE2EDuration="19.194934514s" podCreationTimestamp="2025-10-06 13:20:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:21:02.19003115 +0000 UTC m=+1041.647979374" watchObservedRunningTime="2025-10-06 13:21:02.194934514 +0000 UTC m=+1041.652882658" Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.436971 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7nb9q"] Oct 06 13:21:02 crc kubenswrapper[4867]: W1006 13:21:02.444185 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8fdd1b3_bafe_4fa9_a7c0_ac2d35bb5418.slice/crio-3e5d573d8c2c2716668dae13b0d1445799ad53e7ae0edd3808a0d9323fc3a31b WatchSource:0}: Error finding container 3e5d573d8c2c2716668dae13b0d1445799ad53e7ae0edd3808a0d9323fc3a31b: Status 404 returned error can't find the container with id 3e5d573d8c2c2716668dae13b0d1445799ad53e7ae0edd3808a0d9323fc3a31b Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.506028 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-895bn"] Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.841911 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9rnv4"] Oct 06 13:21:02 crc kubenswrapper[4867]: I1006 13:21:02.850633 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nzl2m"] Oct 06 13:21:02 crc kubenswrapper[4867]: W1006 13:21:02.927699 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c55d65d_f40c_402b_895b_fdf4000fdf33.slice/crio-389650a8ffc662ba1f1ea4b9e68cdd306ef72daf2b92d5edd1f16da21289f6f8 WatchSource:0}: Error finding container 389650a8ffc662ba1f1ea4b9e68cdd306ef72daf2b92d5edd1f16da21289f6f8: Status 404 returned error can't find the container with id 389650a8ffc662ba1f1ea4b9e68cdd306ef72daf2b92d5edd1f16da21289f6f8 Oct 06 13:21:02 crc kubenswrapper[4867]: W1006 13:21:02.967408 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod876182a1_a170_4101_b8ce_041060a74555.slice/crio-f3d8c62fc7d9bdd80072f29dab33b0b7bcc060f21dc6e7152e0525c462ccf477 WatchSource:0}: Error finding container f3d8c62fc7d9bdd80072f29dab33b0b7bcc060f21dc6e7152e0525c462ccf477: Status 404 returned error can't find the container with id f3d8c62fc7d9bdd80072f29dab33b0b7bcc060f21dc6e7152e0525c462ccf477 Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.173703 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9rnv4" event={"ID":"3c55d65d-f40c-402b-895b-fdf4000fdf33","Type":"ContainerStarted","Data":"389650a8ffc662ba1f1ea4b9e68cdd306ef72daf2b92d5edd1f16da21289f6f8"} Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.181748 4867 generic.go:334] "Generic (PLEG): container finished" podID="f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418" containerID="df56205b40bac95fae4350af2aeb4267392a2786e22910736a310118cc729475" exitCode=0 Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.181823 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7nb9q" event={"ID":"f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418","Type":"ContainerDied","Data":"df56205b40bac95fae4350af2aeb4267392a2786e22910736a310118cc729475"} Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.181854 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7nb9q" 
event={"ID":"f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418","Type":"ContainerStarted","Data":"3e5d573d8c2c2716668dae13b0d1445799ad53e7ae0edd3808a0d9323fc3a31b"} Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.187563 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nzl2m" event={"ID":"876182a1-a170-4101-b8ce-041060a74555","Type":"ContainerStarted","Data":"f3d8c62fc7d9bdd80072f29dab33b0b7bcc060f21dc6e7152e0525c462ccf477"} Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.200407 4867 generic.go:334] "Generic (PLEG): container finished" podID="32ed8ecc-126e-40f7-b923-5e26dacacb06" containerID="7ebc760aac1ef5691ff1339dd8c13a2c73d505d8fd35a5c013a97e14ac7bd4fd" exitCode=0 Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.200503 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-895bn" event={"ID":"32ed8ecc-126e-40f7-b923-5e26dacacb06","Type":"ContainerDied","Data":"7ebc760aac1ef5691ff1339dd8c13a2c73d505d8fd35a5c013a97e14ac7bd4fd"} Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.200540 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-895bn" event={"ID":"32ed8ecc-126e-40f7-b923-5e26dacacb06","Type":"ContainerStarted","Data":"ade4cc66348019ecd152158a0d71f480afc8f396721cf55e69953838f2f5845f"} Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.509944 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-jfv8d"] Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.511002 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.514578 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-2dx8w" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.514736 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.569171 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-jfv8d"] Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.630401 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.661905 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx27l\" (UniqueName: \"kubernetes.io/projected/08af8192-3b42-4ae6-85c1-e12ab46ed88a-kube-api-access-tx27l\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.661985 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-config-data\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.662047 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-combined-ca-bundle\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.662075 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-db-sync-config-data\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.764114 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-config-data\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.764212 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-combined-ca-bundle\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.764244 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-db-sync-config-data\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.764414 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx27l\" (UniqueName: \"kubernetes.io/projected/08af8192-3b42-4ae6-85c1-e12ab46ed88a-kube-api-access-tx27l\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.771788 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-db-sync-config-data\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.772159 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-config-data\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.779204 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-combined-ca-bundle\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.779613 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx27l\" (UniqueName: \"kubernetes.io/projected/08af8192-3b42-4ae6-85c1-e12ab46ed88a-kube-api-access-tx27l\") pod \"watcher-db-sync-jfv8d\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:03 crc kubenswrapper[4867]: I1006 13:21:03.860535 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:04 crc kubenswrapper[4867]: I1006 13:21:04.214272 4867 generic.go:334] "Generic (PLEG): container finished" podID="876182a1-a170-4101-b8ce-041060a74555" containerID="4d2f7ac48e5358987e58bbefe0b0670e046ee3876cc0bea663dab58bf9152451" exitCode=0 Oct 06 13:21:04 crc kubenswrapper[4867]: I1006 13:21:04.214823 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nzl2m" event={"ID":"876182a1-a170-4101-b8ce-041060a74555","Type":"ContainerDied","Data":"4d2f7ac48e5358987e58bbefe0b0670e046ee3876cc0bea663dab58bf9152451"} Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.367981 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-jfv8d"] Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.683454 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.744153 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fcb78fdc-l7fqs"] Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.746866 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" podUID="c204022e-4511-4065-a43e-fa8ef02e2768" containerName="dnsmasq-dns" containerID="cri-o://7cf5eecfbca7e8a521c770cf251dd885401a1565a4d8277385605ca5e82978db" gracePeriod=10 Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.801953 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-895bn" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.809705 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7nb9q" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.890242 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpkrp\" (UniqueName: \"kubernetes.io/projected/f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418-kube-api-access-lpkrp\") pod \"f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418\" (UID: \"f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418\") " Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.890437 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnzg8\" (UniqueName: \"kubernetes.io/projected/32ed8ecc-126e-40f7-b923-5e26dacacb06-kube-api-access-lnzg8\") pod \"32ed8ecc-126e-40f7-b923-5e26dacacb06\" (UID: \"32ed8ecc-126e-40f7-b923-5e26dacacb06\") " Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.901085 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418-kube-api-access-lpkrp" (OuterVolumeSpecName: "kube-api-access-lpkrp") pod "f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418" (UID: "f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418"). InnerVolumeSpecName "kube-api-access-lpkrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.901208 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ed8ecc-126e-40f7-b923-5e26dacacb06-kube-api-access-lnzg8" (OuterVolumeSpecName: "kube-api-access-lnzg8") pod "32ed8ecc-126e-40f7-b923-5e26dacacb06" (UID: "32ed8ecc-126e-40f7-b923-5e26dacacb06"). InnerVolumeSpecName "kube-api-access-lnzg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.992610 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnzg8\" (UniqueName: \"kubernetes.io/projected/32ed8ecc-126e-40f7-b923-5e26dacacb06-kube-api-access-lnzg8\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:04.992648 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpkrp\" (UniqueName: \"kubernetes.io/projected/f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418-kube-api-access-lpkrp\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.269813 4867 generic.go:334] "Generic (PLEG): container finished" podID="c204022e-4511-4065-a43e-fa8ef02e2768" containerID="7cf5eecfbca7e8a521c770cf251dd885401a1565a4d8277385605ca5e82978db" exitCode=0 Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.270326 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" event={"ID":"c204022e-4511-4065-a43e-fa8ef02e2768","Type":"ContainerDied","Data":"7cf5eecfbca7e8a521c770cf251dd885401a1565a4d8277385605ca5e82978db"} Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.282403 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-895bn" event={"ID":"32ed8ecc-126e-40f7-b923-5e26dacacb06","Type":"ContainerDied","Data":"ade4cc66348019ecd152158a0d71f480afc8f396721cf55e69953838f2f5845f"} Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.282452 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ade4cc66348019ecd152158a0d71f480afc8f396721cf55e69953838f2f5845f" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.282521 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-895bn" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.285577 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7nb9q" event={"ID":"f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418","Type":"ContainerDied","Data":"3e5d573d8c2c2716668dae13b0d1445799ad53e7ae0edd3808a0d9323fc3a31b"} Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.285643 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e5d573d8c2c2716668dae13b0d1445799ad53e7ae0edd3808a0d9323fc3a31b" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.285598 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7nb9q" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.288545 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-jfv8d" event={"ID":"08af8192-3b42-4ae6-85c1-e12ab46ed88a","Type":"ContainerStarted","Data":"c2f64e61b411fea4645e1b7a3fa2ee4b88fe0241b647aa0c86f25e389c35588d"} Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.359712 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.505376 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-config\") pod \"c204022e-4511-4065-a43e-fa8ef02e2768\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.505437 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-nb\") pod \"c204022e-4511-4065-a43e-fa8ef02e2768\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.509404 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-dns-svc\") pod \"c204022e-4511-4065-a43e-fa8ef02e2768\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.509512 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72lk4\" (UniqueName: \"kubernetes.io/projected/c204022e-4511-4065-a43e-fa8ef02e2768-kube-api-access-72lk4\") pod \"c204022e-4511-4065-a43e-fa8ef02e2768\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.509625 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-sb\") pod \"c204022e-4511-4065-a43e-fa8ef02e2768\" (UID: \"c204022e-4511-4065-a43e-fa8ef02e2768\") " Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.515170 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c204022e-4511-4065-a43e-fa8ef02e2768-kube-api-access-72lk4" (OuterVolumeSpecName: "kube-api-access-72lk4") pod "c204022e-4511-4065-a43e-fa8ef02e2768" (UID: "c204022e-4511-4065-a43e-fa8ef02e2768"). InnerVolumeSpecName "kube-api-access-72lk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.587664 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c204022e-4511-4065-a43e-fa8ef02e2768" (UID: "c204022e-4511-4065-a43e-fa8ef02e2768"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.589734 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c204022e-4511-4065-a43e-fa8ef02e2768" (UID: "c204022e-4511-4065-a43e-fa8ef02e2768"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.615411 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72lk4\" (UniqueName: \"kubernetes.io/projected/c204022e-4511-4065-a43e-fa8ef02e2768-kube-api-access-72lk4\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.615446 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.615456 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.648052 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c204022e-4511-4065-a43e-fa8ef02e2768" (UID: "c204022e-4511-4065-a43e-fa8ef02e2768"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.659230 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-config" (OuterVolumeSpecName: "config") pod "c204022e-4511-4065-a43e-fa8ef02e2768" (UID: "c204022e-4511-4065-a43e-fa8ef02e2768"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.720981 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.721465 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c204022e-4511-4065-a43e-fa8ef02e2768-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.774925 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nzl2m" Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.928291 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t86tp\" (UniqueName: \"kubernetes.io/projected/876182a1-a170-4101-b8ce-041060a74555-kube-api-access-t86tp\") pod \"876182a1-a170-4101-b8ce-041060a74555\" (UID: \"876182a1-a170-4101-b8ce-041060a74555\") " Oct 06 13:21:05 crc kubenswrapper[4867]: I1006 13:21:05.935957 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/876182a1-a170-4101-b8ce-041060a74555-kube-api-access-t86tp" (OuterVolumeSpecName: "kube-api-access-t86tp") pod "876182a1-a170-4101-b8ce-041060a74555" (UID: "876182a1-a170-4101-b8ce-041060a74555"). InnerVolumeSpecName "kube-api-access-t86tp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:06 crc kubenswrapper[4867]: I1006 13:21:06.030241 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t86tp\" (UniqueName: \"kubernetes.io/projected/876182a1-a170-4101-b8ce-041060a74555-kube-api-access-t86tp\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:06 crc kubenswrapper[4867]: I1006 13:21:06.324888 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nzl2m" event={"ID":"876182a1-a170-4101-b8ce-041060a74555","Type":"ContainerDied","Data":"f3d8c62fc7d9bdd80072f29dab33b0b7bcc060f21dc6e7152e0525c462ccf477"} Oct 06 13:21:06 crc kubenswrapper[4867]: I1006 13:21:06.326561 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3d8c62fc7d9bdd80072f29dab33b0b7bcc060f21dc6e7152e0525c462ccf477" Oct 06 13:21:06 crc kubenswrapper[4867]: I1006 13:21:06.324930 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nzl2m" Oct 06 13:21:06 crc kubenswrapper[4867]: I1006 13:21:06.331932 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" event={"ID":"c204022e-4511-4065-a43e-fa8ef02e2768","Type":"ContainerDied","Data":"66f86b4f88d8830bb355670e305fdfc2291e0bbe4ab85f5db7e71e45832dd172"} Oct 06 13:21:06 crc kubenswrapper[4867]: I1006 13:21:06.332001 4867 scope.go:117] "RemoveContainer" containerID="7cf5eecfbca7e8a521c770cf251dd885401a1565a4d8277385605ca5e82978db" Oct 06 13:21:06 crc kubenswrapper[4867]: I1006 13:21:06.332236 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67fcb78fdc-l7fqs" Oct 06 13:21:06 crc kubenswrapper[4867]: I1006 13:21:06.384309 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67fcb78fdc-l7fqs"] Oct 06 13:21:06 crc kubenswrapper[4867]: I1006 13:21:06.396135 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67fcb78fdc-l7fqs"] Oct 06 13:21:07 crc kubenswrapper[4867]: I1006 13:21:07.232057 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c204022e-4511-4065-a43e-fa8ef02e2768" path="/var/lib/kubelet/pods/c204022e-4511-4065-a43e-fa8ef02e2768/volumes" Oct 06 13:21:08 crc kubenswrapper[4867]: I1006 13:21:08.358574 4867 generic.go:334] "Generic (PLEG): container finished" podID="0efadce0-13dd-4a6d-9a54-9deabd1e8069" containerID="2cd606c92663700319d70ddfe520d03e0d4c8353be64b821591a9e77718e5d30" exitCode=0 Oct 06 13:21:08 crc kubenswrapper[4867]: I1006 13:21:08.358638 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5kc59" event={"ID":"0efadce0-13dd-4a6d-9a54-9deabd1e8069","Type":"ContainerDied","Data":"2cd606c92663700319d70ddfe520d03e0d4c8353be64b821591a9e77718e5d30"} Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.554476 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-53a2-account-create-wfj5z"] Oct 06 13:21:11 crc kubenswrapper[4867]: E1006 13:21:11.555605 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c204022e-4511-4065-a43e-fa8ef02e2768" containerName="dnsmasq-dns" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.555621 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c204022e-4511-4065-a43e-fa8ef02e2768" containerName="dnsmasq-dns" Oct 06 13:21:11 crc kubenswrapper[4867]: E1006 13:21:11.559967 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418" containerName="mariadb-database-create" Oct 06 13:21:11 crc 
kubenswrapper[4867]: I1006 13:21:11.560001 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418" containerName="mariadb-database-create" Oct 06 13:21:11 crc kubenswrapper[4867]: E1006 13:21:11.560018 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c204022e-4511-4065-a43e-fa8ef02e2768" containerName="init" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.560026 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c204022e-4511-4065-a43e-fa8ef02e2768" containerName="init" Oct 06 13:21:11 crc kubenswrapper[4867]: E1006 13:21:11.560045 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ed8ecc-126e-40f7-b923-5e26dacacb06" containerName="mariadb-database-create" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.560051 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ed8ecc-126e-40f7-b923-5e26dacacb06" containerName="mariadb-database-create" Oct 06 13:21:11 crc kubenswrapper[4867]: E1006 13:21:11.560074 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876182a1-a170-4101-b8ce-041060a74555" containerName="mariadb-database-create" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.560081 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="876182a1-a170-4101-b8ce-041060a74555" containerName="mariadb-database-create" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.560428 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="876182a1-a170-4101-b8ce-041060a74555" containerName="mariadb-database-create" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.560457 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418" containerName="mariadb-database-create" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.560471 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c204022e-4511-4065-a43e-fa8ef02e2768" 
containerName="dnsmasq-dns" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.560482 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ed8ecc-126e-40f7-b923-5e26dacacb06" containerName="mariadb-database-create" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.561221 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-53a2-account-create-wfj5z" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.563238 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-53a2-account-create-wfj5z"] Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.564554 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.670850 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd4zq\" (UniqueName: \"kubernetes.io/projected/7cbf821b-c375-4824-95ea-d8774ffb7486-kube-api-access-cd4zq\") pod \"barbican-53a2-account-create-wfj5z\" (UID: \"7cbf821b-c375-4824-95ea-d8774ffb7486\") " pod="openstack/barbican-53a2-account-create-wfj5z" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.745319 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-23c7-account-create-w2ndn"] Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.747062 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-23c7-account-create-w2ndn" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.749941 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.760702 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-23c7-account-create-w2ndn"] Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.772009 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd4zq\" (UniqueName: \"kubernetes.io/projected/7cbf821b-c375-4824-95ea-d8774ffb7486-kube-api-access-cd4zq\") pod \"barbican-53a2-account-create-wfj5z\" (UID: \"7cbf821b-c375-4824-95ea-d8774ffb7486\") " pod="openstack/barbican-53a2-account-create-wfj5z" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.790837 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd4zq\" (UniqueName: \"kubernetes.io/projected/7cbf821b-c375-4824-95ea-d8774ffb7486-kube-api-access-cd4zq\") pod \"barbican-53a2-account-create-wfj5z\" (UID: \"7cbf821b-c375-4824-95ea-d8774ffb7486\") " pod="openstack/barbican-53a2-account-create-wfj5z" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.874076 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hl57\" (UniqueName: \"kubernetes.io/projected/b3e2c571-fd42-4b41-b31e-4774988cfb31-kube-api-access-2hl57\") pod \"cinder-23c7-account-create-w2ndn\" (UID: \"b3e2c571-fd42-4b41-b31e-4774988cfb31\") " pod="openstack/cinder-23c7-account-create-w2ndn" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.891703 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-53a2-account-create-wfj5z" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.952769 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ef38-account-create-9qrb7"] Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.954145 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ef38-account-create-9qrb7" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.956326 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.969503 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ef38-account-create-9qrb7"] Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.975659 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hl57\" (UniqueName: \"kubernetes.io/projected/b3e2c571-fd42-4b41-b31e-4774988cfb31-kube-api-access-2hl57\") pod \"cinder-23c7-account-create-w2ndn\" (UID: \"b3e2c571-fd42-4b41-b31e-4774988cfb31\") " pod="openstack/cinder-23c7-account-create-w2ndn" Oct 06 13:21:11 crc kubenswrapper[4867]: I1006 13:21:11.996693 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hl57\" (UniqueName: \"kubernetes.io/projected/b3e2c571-fd42-4b41-b31e-4774988cfb31-kube-api-access-2hl57\") pod \"cinder-23c7-account-create-w2ndn\" (UID: \"b3e2c571-fd42-4b41-b31e-4774988cfb31\") " pod="openstack/cinder-23c7-account-create-w2ndn" Oct 06 13:21:12 crc kubenswrapper[4867]: I1006 13:21:12.077476 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8trk\" (UniqueName: \"kubernetes.io/projected/2ea503d8-6da7-4349-b16c-85e3e66a9f9e-kube-api-access-n8trk\") pod \"neutron-ef38-account-create-9qrb7\" (UID: \"2ea503d8-6da7-4349-b16c-85e3e66a9f9e\") " 
pod="openstack/neutron-ef38-account-create-9qrb7" Oct 06 13:21:12 crc kubenswrapper[4867]: I1006 13:21:12.082639 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-23c7-account-create-w2ndn" Oct 06 13:21:12 crc kubenswrapper[4867]: I1006 13:21:12.179385 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8trk\" (UniqueName: \"kubernetes.io/projected/2ea503d8-6da7-4349-b16c-85e3e66a9f9e-kube-api-access-n8trk\") pod \"neutron-ef38-account-create-9qrb7\" (UID: \"2ea503d8-6da7-4349-b16c-85e3e66a9f9e\") " pod="openstack/neutron-ef38-account-create-9qrb7" Oct 06 13:21:12 crc kubenswrapper[4867]: I1006 13:21:12.218910 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8trk\" (UniqueName: \"kubernetes.io/projected/2ea503d8-6da7-4349-b16c-85e3e66a9f9e-kube-api-access-n8trk\") pod \"neutron-ef38-account-create-9qrb7\" (UID: \"2ea503d8-6da7-4349-b16c-85e3e66a9f9e\") " pod="openstack/neutron-ef38-account-create-9qrb7" Oct 06 13:21:12 crc kubenswrapper[4867]: I1006 13:21:12.282929 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ef38-account-create-9qrb7" Oct 06 13:21:12 crc kubenswrapper[4867]: I1006 13:21:12.628326 4867 scope.go:117] "RemoveContainer" containerID="a4337dfc6f67a65ee0ce8a3f62d04fe3fe8008d640f64cc79966fd49415f6605" Oct 06 13:21:12 crc kubenswrapper[4867]: I1006 13:21:12.873674 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:21:12 crc kubenswrapper[4867]: I1006 13:21:12.873781 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:21:12 crc kubenswrapper[4867]: I1006 13:21:12.873875 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:21:12 crc kubenswrapper[4867]: I1006 13:21:12.874895 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a46508d237859c347210237945b8f376811db88e9f318300207a6c9aaeafb5d"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:21:12 crc kubenswrapper[4867]: I1006 13:21:12.875006 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" 
containerID="cri-o://0a46508d237859c347210237945b8f376811db88e9f318300207a6c9aaeafb5d" gracePeriod=600 Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.384980 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5kc59" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.417984 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="0a46508d237859c347210237945b8f376811db88e9f318300207a6c9aaeafb5d" exitCode=0 Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.418045 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"0a46508d237859c347210237945b8f376811db88e9f318300207a6c9aaeafb5d"} Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.418109 4867 scope.go:117] "RemoveContainer" containerID="21254038d2e08625414f4e3fd77d4aa603650bf9aa5cea1080c49abec73a2651" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.422610 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5kc59" event={"ID":"0efadce0-13dd-4a6d-9a54-9deabd1e8069","Type":"ContainerDied","Data":"d689693337e4946b42114cdb55f0dc0fb02689a720e8692afb40a2667cde3046"} Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.422665 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d689693337e4946b42114cdb55f0dc0fb02689a720e8692afb40a2667cde3046" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.422762 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-5kc59" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.506634 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-config-data\") pod \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.506702 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-db-sync-config-data\") pod \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.506749 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-combined-ca-bundle\") pod \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.506841 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sm5t\" (UniqueName: \"kubernetes.io/projected/0efadce0-13dd-4a6d-9a54-9deabd1e8069-kube-api-access-2sm5t\") pod \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\" (UID: \"0efadce0-13dd-4a6d-9a54-9deabd1e8069\") " Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.515481 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0efadce0-13dd-4a6d-9a54-9deabd1e8069" (UID: "0efadce0-13dd-4a6d-9a54-9deabd1e8069"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.515901 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0efadce0-13dd-4a6d-9a54-9deabd1e8069-kube-api-access-2sm5t" (OuterVolumeSpecName: "kube-api-access-2sm5t") pod "0efadce0-13dd-4a6d-9a54-9deabd1e8069" (UID: "0efadce0-13dd-4a6d-9a54-9deabd1e8069"). InnerVolumeSpecName "kube-api-access-2sm5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.544671 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0efadce0-13dd-4a6d-9a54-9deabd1e8069" (UID: "0efadce0-13dd-4a6d-9a54-9deabd1e8069"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.572217 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-config-data" (OuterVolumeSpecName: "config-data") pod "0efadce0-13dd-4a6d-9a54-9deabd1e8069" (UID: "0efadce0-13dd-4a6d-9a54-9deabd1e8069"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.615903 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-23c7-account-create-w2ndn"] Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.616269 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.616315 4867 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.616332 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0efadce0-13dd-4a6d-9a54-9deabd1e8069-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.616346 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sm5t\" (UniqueName: \"kubernetes.io/projected/0efadce0-13dd-4a6d-9a54-9deabd1e8069-kube-api-access-2sm5t\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.631810 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.638064 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.753543 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ef38-account-create-9qrb7"] Oct 06 13:21:13 crc kubenswrapper[4867]: W1006 13:21:13.754316 4867 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cbf821b_c375_4824_95ea_d8774ffb7486.slice/crio-2139a5069ccc9ee42f5016994c0b18f6fc6341fadc85ebf32d70de4d51d5ca06 WatchSource:0}: Error finding container 2139a5069ccc9ee42f5016994c0b18f6fc6341fadc85ebf32d70de4d51d5ca06: Status 404 returned error can't find the container with id 2139a5069ccc9ee42f5016994c0b18f6fc6341fadc85ebf32d70de4d51d5ca06 Oct 06 13:21:13 crc kubenswrapper[4867]: I1006 13:21:13.760065 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-53a2-account-create-wfj5z"] Oct 06 13:21:13 crc kubenswrapper[4867]: W1006 13:21:13.766472 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ea503d8_6da7_4349_b16c_85e3e66a9f9e.slice/crio-3c653d01b3692dcf7f65a3cb261ff159592f2c22410bfa0842107f307bde1fa7 WatchSource:0}: Error finding container 3c653d01b3692dcf7f65a3cb261ff159592f2c22410bfa0842107f307bde1fa7: Status 404 returned error can't find the container with id 3c653d01b3692dcf7f65a3cb261ff159592f2c22410bfa0842107f307bde1fa7 Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.437076 4867 generic.go:334] "Generic (PLEG): container finished" podID="b3e2c571-fd42-4b41-b31e-4774988cfb31" containerID="3d59006217d14faf12d3123f98b6d12b667aafcdfc0ed2c3dfbcfb9d24cfe57d" exitCode=0 Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.437152 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-23c7-account-create-w2ndn" event={"ID":"b3e2c571-fd42-4b41-b31e-4774988cfb31","Type":"ContainerDied","Data":"3d59006217d14faf12d3123f98b6d12b667aafcdfc0ed2c3dfbcfb9d24cfe57d"} Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.437539 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-23c7-account-create-w2ndn" 
event={"ID":"b3e2c571-fd42-4b41-b31e-4774988cfb31","Type":"ContainerStarted","Data":"6a6d1fe9c6cdf9a6940754b63613b3691f7e0993d09d00a6be1037350b9f535a"} Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.439384 4867 generic.go:334] "Generic (PLEG): container finished" podID="2ea503d8-6da7-4349-b16c-85e3e66a9f9e" containerID="5ac16da2ba117970df0cf4e089c407ddb76be0931c4331f7e9e7a92ed5283470" exitCode=0 Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.439574 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ef38-account-create-9qrb7" event={"ID":"2ea503d8-6da7-4349-b16c-85e3e66a9f9e","Type":"ContainerDied","Data":"5ac16da2ba117970df0cf4e089c407ddb76be0931c4331f7e9e7a92ed5283470"} Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.439610 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ef38-account-create-9qrb7" event={"ID":"2ea503d8-6da7-4349-b16c-85e3e66a9f9e","Type":"ContainerStarted","Data":"3c653d01b3692dcf7f65a3cb261ff159592f2c22410bfa0842107f307bde1fa7"} Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.443383 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9rnv4" event={"ID":"3c55d65d-f40c-402b-895b-fdf4000fdf33","Type":"ContainerStarted","Data":"2699f945efc447e5da282252d22177c88eea10e9f7b730286a2b81c9e8b50eb9"} Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.456903 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"266184608e50b4d6729b714d56d0cdb437a575eeec8c7e5f18126b05fc5a103e"} Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.459571 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-jfv8d" event={"ID":"08af8192-3b42-4ae6-85c1-e12ab46ed88a","Type":"ContainerStarted","Data":"92ba3ad6abe82e07d130480afe543510a84824dbb9d6e46c52ba0429207ec901"} 
Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.466217 4867 generic.go:334] "Generic (PLEG): container finished" podID="7cbf821b-c375-4824-95ea-d8774ffb7486" containerID="e5d268946da403efe40890394102b5786ecef49fb9dd02be60e4a623e31b220d" exitCode=0 Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.467207 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-53a2-account-create-wfj5z" event={"ID":"7cbf821b-c375-4824-95ea-d8774ffb7486","Type":"ContainerDied","Data":"e5d268946da403efe40890394102b5786ecef49fb9dd02be60e4a623e31b220d"} Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.467272 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-53a2-account-create-wfj5z" event={"ID":"7cbf821b-c375-4824-95ea-d8774ffb7486","Type":"ContainerStarted","Data":"2139a5069ccc9ee42f5016994c0b18f6fc6341fadc85ebf32d70de4d51d5ca06"} Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.475684 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.478331 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-9rnv4" podStartSLOduration=3.242275915 podStartE2EDuration="13.478307286s" podCreationTimestamp="2025-10-06 13:21:01 +0000 UTC" firstStartedPulling="2025-10-06 13:21:02.930832726 +0000 UTC m=+1042.388780870" lastFinishedPulling="2025-10-06 13:21:13.166864097 +0000 UTC m=+1052.624812241" observedRunningTime="2025-10-06 13:21:14.470698668 +0000 UTC m=+1053.928646812" watchObservedRunningTime="2025-10-06 13:21:14.478307286 +0000 UTC m=+1053.936255430" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.566526 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-jfv8d" podStartSLOduration=2.759363356 podStartE2EDuration="11.566500817s" podCreationTimestamp="2025-10-06 13:21:03 +0000 UTC" 
firstStartedPulling="2025-10-06 13:21:04.385359227 +0000 UTC m=+1043.843307371" lastFinishedPulling="2025-10-06 13:21:13.192496688 +0000 UTC m=+1052.650444832" observedRunningTime="2025-10-06 13:21:14.563844134 +0000 UTC m=+1054.021792278" watchObservedRunningTime="2025-10-06 13:21:14.566500817 +0000 UTC m=+1054.024448971" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.804311 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-868dfdb867-qjx9n"] Oct 06 13:21:14 crc kubenswrapper[4867]: E1006 13:21:14.814998 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0efadce0-13dd-4a6d-9a54-9deabd1e8069" containerName="glance-db-sync" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.815091 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0efadce0-13dd-4a6d-9a54-9deabd1e8069" containerName="glance-db-sync" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.815342 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="0efadce0-13dd-4a6d-9a54-9deabd1e8069" containerName="glance-db-sync" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.816405 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.830643 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-868dfdb867-qjx9n"] Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.943932 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-config\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.943981 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc297\" (UniqueName: \"kubernetes.io/projected/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-kube-api-access-jc297\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.944012 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-swift-storage-0\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.944036 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-nb\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.944134 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-sb\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:14 crc kubenswrapper[4867]: I1006 13:21:14.944184 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-svc\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.048432 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-config\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.048487 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc297\" (UniqueName: \"kubernetes.io/projected/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-kube-api-access-jc297\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.048508 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-swift-storage-0\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.048532 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-nb\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.048641 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-sb\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.048670 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-svc\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.053296 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-config\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.053968 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-nb\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.058242 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-svc\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.058516 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-sb\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.058666 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-swift-storage-0\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.078036 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc297\" (UniqueName: \"kubernetes.io/projected/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-kube-api-access-jc297\") pod \"dnsmasq-dns-868dfdb867-qjx9n\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.154614 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.691952 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-868dfdb867-qjx9n"] Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.874146 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ef38-account-create-9qrb7" Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.975953 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8trk\" (UniqueName: \"kubernetes.io/projected/2ea503d8-6da7-4349-b16c-85e3e66a9f9e-kube-api-access-n8trk\") pod \"2ea503d8-6da7-4349-b16c-85e3e66a9f9e\" (UID: \"2ea503d8-6da7-4349-b16c-85e3e66a9f9e\") " Oct 06 13:21:15 crc kubenswrapper[4867]: I1006 13:21:15.984190 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea503d8-6da7-4349-b16c-85e3e66a9f9e-kube-api-access-n8trk" (OuterVolumeSpecName: "kube-api-access-n8trk") pod "2ea503d8-6da7-4349-b16c-85e3e66a9f9e" (UID: "2ea503d8-6da7-4349-b16c-85e3e66a9f9e"). InnerVolumeSpecName "kube-api-access-n8trk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.048658 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-23c7-account-create-w2ndn" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.078933 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8trk\" (UniqueName: \"kubernetes.io/projected/2ea503d8-6da7-4349-b16c-85e3e66a9f9e-kube-api-access-n8trk\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.145372 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-53a2-account-create-wfj5z" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.185196 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hl57\" (UniqueName: \"kubernetes.io/projected/b3e2c571-fd42-4b41-b31e-4774988cfb31-kube-api-access-2hl57\") pod \"b3e2c571-fd42-4b41-b31e-4774988cfb31\" (UID: \"b3e2c571-fd42-4b41-b31e-4774988cfb31\") " Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.201832 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e2c571-fd42-4b41-b31e-4774988cfb31-kube-api-access-2hl57" (OuterVolumeSpecName: "kube-api-access-2hl57") pod "b3e2c571-fd42-4b41-b31e-4774988cfb31" (UID: "b3e2c571-fd42-4b41-b31e-4774988cfb31"). InnerVolumeSpecName "kube-api-access-2hl57". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.287122 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd4zq\" (UniqueName: \"kubernetes.io/projected/7cbf821b-c375-4824-95ea-d8774ffb7486-kube-api-access-cd4zq\") pod \"7cbf821b-c375-4824-95ea-d8774ffb7486\" (UID: \"7cbf821b-c375-4824-95ea-d8774ffb7486\") " Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.287946 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hl57\" (UniqueName: \"kubernetes.io/projected/b3e2c571-fd42-4b41-b31e-4774988cfb31-kube-api-access-2hl57\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.292861 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cbf821b-c375-4824-95ea-d8774ffb7486-kube-api-access-cd4zq" (OuterVolumeSpecName: "kube-api-access-cd4zq") pod "7cbf821b-c375-4824-95ea-d8774ffb7486" (UID: "7cbf821b-c375-4824-95ea-d8774ffb7486"). InnerVolumeSpecName "kube-api-access-cd4zq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.389179 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd4zq\" (UniqueName: \"kubernetes.io/projected/7cbf821b-c375-4824-95ea-d8774ffb7486-kube-api-access-cd4zq\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.484009 4867 generic.go:334] "Generic (PLEG): container finished" podID="2c9f6f61-8db8-4823-b09d-4f68bf749c3c" containerID="c5198fc6c00032a552a3bc21cdd2dd945f9d68448a893cae3b6ab47cf6203c2d" exitCode=0 Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.484073 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" event={"ID":"2c9f6f61-8db8-4823-b09d-4f68bf749c3c","Type":"ContainerDied","Data":"c5198fc6c00032a552a3bc21cdd2dd945f9d68448a893cae3b6ab47cf6203c2d"} Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.484154 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" event={"ID":"2c9f6f61-8db8-4823-b09d-4f68bf749c3c","Type":"ContainerStarted","Data":"89ee8d309a0c2105f562feeca82de155805328193f281553a687c7f97b809c0e"} Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.490296 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ef38-account-create-9qrb7" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.490466 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ef38-account-create-9qrb7" event={"ID":"2ea503d8-6da7-4349-b16c-85e3e66a9f9e","Type":"ContainerDied","Data":"3c653d01b3692dcf7f65a3cb261ff159592f2c22410bfa0842107f307bde1fa7"} Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.490576 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c653d01b3692dcf7f65a3cb261ff159592f2c22410bfa0842107f307bde1fa7" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.494022 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-53a2-account-create-wfj5z" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.494016 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-53a2-account-create-wfj5z" event={"ID":"7cbf821b-c375-4824-95ea-d8774ffb7486","Type":"ContainerDied","Data":"2139a5069ccc9ee42f5016994c0b18f6fc6341fadc85ebf32d70de4d51d5ca06"} Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.494142 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2139a5069ccc9ee42f5016994c0b18f6fc6341fadc85ebf32d70de4d51d5ca06" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.496662 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-23c7-account-create-w2ndn" event={"ID":"b3e2c571-fd42-4b41-b31e-4774988cfb31","Type":"ContainerDied","Data":"6a6d1fe9c6cdf9a6940754b63613b3691f7e0993d09d00a6be1037350b9f535a"} Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.496722 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a6d1fe9c6cdf9a6940754b63613b3691f7e0993d09d00a6be1037350b9f535a" Oct 06 13:21:16 crc kubenswrapper[4867]: I1006 13:21:16.496803 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-23c7-account-create-w2ndn" Oct 06 13:21:16 crc kubenswrapper[4867]: E1006 13:21:16.536600 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c9f6f61_8db8_4823_b09d_4f68bf749c3c.slice/crio-c5198fc6c00032a552a3bc21cdd2dd945f9d68448a893cae3b6ab47cf6203c2d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c9f6f61_8db8_4823_b09d_4f68bf749c3c.slice/crio-conmon-c5198fc6c00032a552a3bc21cdd2dd945f9d68448a893cae3b6ab47cf6203c2d.scope\": RecentStats: unable to find data in memory cache]" Oct 06 13:21:17 crc kubenswrapper[4867]: I1006 13:21:17.507158 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" event={"ID":"2c9f6f61-8db8-4823-b09d-4f68bf749c3c","Type":"ContainerStarted","Data":"029d83d6726ea6ab389b72e58cd008dadbdb4d358dc4997aff618e8feadfaf35"} Oct 06 13:21:17 crc kubenswrapper[4867]: I1006 13:21:17.507829 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:17 crc kubenswrapper[4867]: I1006 13:21:17.532398 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" podStartSLOduration=3.532371812 podStartE2EDuration="3.532371812s" podCreationTimestamp="2025-10-06 13:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:21:17.52935856 +0000 UTC m=+1056.987306734" watchObservedRunningTime="2025-10-06 13:21:17.532371812 +0000 UTC m=+1056.990319946" Oct 06 13:21:18 crc kubenswrapper[4867]: I1006 13:21:18.518530 4867 generic.go:334] "Generic (PLEG): container finished" podID="3c55d65d-f40c-402b-895b-fdf4000fdf33" 
containerID="2699f945efc447e5da282252d22177c88eea10e9f7b730286a2b81c9e8b50eb9" exitCode=0 Oct 06 13:21:18 crc kubenswrapper[4867]: I1006 13:21:18.518591 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9rnv4" event={"ID":"3c55d65d-f40c-402b-895b-fdf4000fdf33","Type":"ContainerDied","Data":"2699f945efc447e5da282252d22177c88eea10e9f7b730286a2b81c9e8b50eb9"} Oct 06 13:21:18 crc kubenswrapper[4867]: I1006 13:21:18.521061 4867 generic.go:334] "Generic (PLEG): container finished" podID="08af8192-3b42-4ae6-85c1-e12ab46ed88a" containerID="92ba3ad6abe82e07d130480afe543510a84824dbb9d6e46c52ba0429207ec901" exitCode=0 Oct 06 13:21:18 crc kubenswrapper[4867]: I1006 13:21:18.521185 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-jfv8d" event={"ID":"08af8192-3b42-4ae6-85c1-e12ab46ed88a","Type":"ContainerDied","Data":"92ba3ad6abe82e07d130480afe543510a84824dbb9d6e46c52ba0429207ec901"} Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.035903 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.041651 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.164370 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-db-sync-config-data\") pod \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.164418 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-combined-ca-bundle\") pod \"3c55d65d-f40c-402b-895b-fdf4000fdf33\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.164490 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-config-data\") pod \"3c55d65d-f40c-402b-895b-fdf4000fdf33\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.164519 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx27l\" (UniqueName: \"kubernetes.io/projected/08af8192-3b42-4ae6-85c1-e12ab46ed88a-kube-api-access-tx27l\") pod \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.164574 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-config-data\") pod \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.164622 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2dw7v\" (UniqueName: \"kubernetes.io/projected/3c55d65d-f40c-402b-895b-fdf4000fdf33-kube-api-access-2dw7v\") pod \"3c55d65d-f40c-402b-895b-fdf4000fdf33\" (UID: \"3c55d65d-f40c-402b-895b-fdf4000fdf33\") " Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.164649 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-combined-ca-bundle\") pod \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\" (UID: \"08af8192-3b42-4ae6-85c1-e12ab46ed88a\") " Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.172825 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08af8192-3b42-4ae6-85c1-e12ab46ed88a-kube-api-access-tx27l" (OuterVolumeSpecName: "kube-api-access-tx27l") pod "08af8192-3b42-4ae6-85c1-e12ab46ed88a" (UID: "08af8192-3b42-4ae6-85c1-e12ab46ed88a"). InnerVolumeSpecName "kube-api-access-tx27l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.172982 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c55d65d-f40c-402b-895b-fdf4000fdf33-kube-api-access-2dw7v" (OuterVolumeSpecName: "kube-api-access-2dw7v") pod "3c55d65d-f40c-402b-895b-fdf4000fdf33" (UID: "3c55d65d-f40c-402b-895b-fdf4000fdf33"). InnerVolumeSpecName "kube-api-access-2dw7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.173137 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "08af8192-3b42-4ae6-85c1-e12ab46ed88a" (UID: "08af8192-3b42-4ae6-85c1-e12ab46ed88a"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.203216 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c55d65d-f40c-402b-895b-fdf4000fdf33" (UID: "3c55d65d-f40c-402b-895b-fdf4000fdf33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.226960 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08af8192-3b42-4ae6-85c1-e12ab46ed88a" (UID: "08af8192-3b42-4ae6-85c1-e12ab46ed88a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.229397 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-config-data" (OuterVolumeSpecName: "config-data") pod "3c55d65d-f40c-402b-895b-fdf4000fdf33" (UID: "3c55d65d-f40c-402b-895b-fdf4000fdf33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.234414 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-config-data" (OuterVolumeSpecName: "config-data") pod "08af8192-3b42-4ae6-85c1-e12ab46ed88a" (UID: "08af8192-3b42-4ae6-85c1-e12ab46ed88a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.267288 4867 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.267320 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.267331 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c55d65d-f40c-402b-895b-fdf4000fdf33-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.267340 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx27l\" (UniqueName: \"kubernetes.io/projected/08af8192-3b42-4ae6-85c1-e12ab46ed88a-kube-api-access-tx27l\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.267352 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.267362 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dw7v\" (UniqueName: \"kubernetes.io/projected/3c55d65d-f40c-402b-895b-fdf4000fdf33-kube-api-access-2dw7v\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.267371 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08af8192-3b42-4ae6-85c1-e12ab46ed88a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:20 crc kubenswrapper[4867]: 
I1006 13:21:20.545043 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9rnv4" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.544989 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9rnv4" event={"ID":"3c55d65d-f40c-402b-895b-fdf4000fdf33","Type":"ContainerDied","Data":"389650a8ffc662ba1f1ea4b9e68cdd306ef72daf2b92d5edd1f16da21289f6f8"} Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.545145 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="389650a8ffc662ba1f1ea4b9e68cdd306ef72daf2b92d5edd1f16da21289f6f8" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.547413 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-jfv8d" event={"ID":"08af8192-3b42-4ae6-85c1-e12ab46ed88a","Type":"ContainerDied","Data":"c2f64e61b411fea4645e1b7a3fa2ee4b88fe0241b647aa0c86f25e389c35588d"} Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.547465 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2f64e61b411fea4645e1b7a3fa2ee4b88fe0241b647aa0c86f25e389c35588d" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.547532 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-jfv8d" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.843027 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-868dfdb867-qjx9n"] Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.843449 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" podUID="2c9f6f61-8db8-4823-b09d-4f68bf749c3c" containerName="dnsmasq-dns" containerID="cri-o://029d83d6726ea6ab389b72e58cd008dadbdb4d358dc4997aff618e8feadfaf35" gracePeriod=10 Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.912009 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2vv4k"] Oct 06 13:21:20 crc kubenswrapper[4867]: E1006 13:21:20.912990 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c55d65d-f40c-402b-895b-fdf4000fdf33" containerName="keystone-db-sync" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.913008 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c55d65d-f40c-402b-895b-fdf4000fdf33" containerName="keystone-db-sync" Oct 06 13:21:20 crc kubenswrapper[4867]: E1006 13:21:20.913033 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e2c571-fd42-4b41-b31e-4774988cfb31" containerName="mariadb-account-create" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.913042 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e2c571-fd42-4b41-b31e-4774988cfb31" containerName="mariadb-account-create" Oct 06 13:21:20 crc kubenswrapper[4867]: E1006 13:21:20.913060 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbf821b-c375-4824-95ea-d8774ffb7486" containerName="mariadb-account-create" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.913069 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cbf821b-c375-4824-95ea-d8774ffb7486" containerName="mariadb-account-create" Oct 06 13:21:20 crc 
kubenswrapper[4867]: E1006 13:21:20.913088 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08af8192-3b42-4ae6-85c1-e12ab46ed88a" containerName="watcher-db-sync" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.913095 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="08af8192-3b42-4ae6-85c1-e12ab46ed88a" containerName="watcher-db-sync" Oct 06 13:21:20 crc kubenswrapper[4867]: E1006 13:21:20.913113 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea503d8-6da7-4349-b16c-85e3e66a9f9e" containerName="mariadb-account-create" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.913121 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea503d8-6da7-4349-b16c-85e3e66a9f9e" containerName="mariadb-account-create" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.913395 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cbf821b-c375-4824-95ea-d8774ffb7486" containerName="mariadb-account-create" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.913411 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c55d65d-f40c-402b-895b-fdf4000fdf33" containerName="keystone-db-sync" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.913429 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e2c571-fd42-4b41-b31e-4774988cfb31" containerName="mariadb-account-create" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.913441 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="08af8192-3b42-4ae6-85c1-e12ab46ed88a" containerName="watcher-db-sync" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.913453 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea503d8-6da7-4349-b16c-85e3e66a9f9e" containerName="mariadb-account-create" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.914286 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.922691 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.923052 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.923456 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sjvr2" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.926949 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.930382 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-667d8947b7-xqcx6"] Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.932709 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982619 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-svc\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982680 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-fernet-keys\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982761 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-jpf9r\" (UniqueName: \"kubernetes.io/projected/7e31671e-6528-480a-ade5-2f20ca954bad-kube-api-access-jpf9r\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982801 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-combined-ca-bundle\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982824 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjs64\" (UniqueName: \"kubernetes.io/projected/3824617d-49df-4851-867f-284560eeaa2c-kube-api-access-kjs64\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982868 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-credential-keys\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982886 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-config\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982903 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-config-data\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982921 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-sb\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982939 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-nb\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982953 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-swift-storage-0\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:20 crc kubenswrapper[4867]: I1006 13:21:20.982980 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-scripts\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:20.999975 4867 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2vv4k"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.043891 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-667d8947b7-xqcx6"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.086850 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088613 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088633 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpf9r\" (UniqueName: \"kubernetes.io/projected/7e31671e-6528-480a-ade5-2f20ca954bad-kube-api-access-jpf9r\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088695 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-combined-ca-bundle\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088723 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjs64\" (UniqueName: \"kubernetes.io/projected/3824617d-49df-4851-867f-284560eeaa2c-kube-api-access-kjs64\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088765 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-credential-keys\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088785 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-config\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088810 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-config-data\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088836 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-sb\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088857 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-nb\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088874 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-swift-storage-0\") pod 
\"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088903 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-scripts\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088926 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-svc\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.088952 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-fernet-keys\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.102201 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-credential-keys\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.104091 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-fernet-keys\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc 
kubenswrapper[4867]: I1006 13:21:21.108304 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-config-data\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.110608 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.114549 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-swift-storage-0\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.118367 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-sb\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.126103 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-config\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.127099 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-nb\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " 
pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.127234 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-svc\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.127529 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.127557 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-scripts\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.130671 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-combined-ca-bundle\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.164057 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-2dx8w" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.164681 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.164838 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.190896 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-hb5c8\" (UniqueName: \"kubernetes.io/projected/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-kube-api-access-hb5c8\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.190955 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.190994 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78044486-0042-4b06-ae63-0dbfe9b873ce-logs\") pod \"watcher-applier-0\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.191092 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-config-data\") pod \"watcher-applier-0\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.191118 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv522\" (UniqueName: \"kubernetes.io/projected/78044486-0042-4b06-ae63-0dbfe9b873ce-kube-api-access-vv522\") pod \"watcher-applier-0\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.191138 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-logs\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.191155 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.191193 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.191212 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.198782 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.268881 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpf9r\" (UniqueName: \"kubernetes.io/projected/7e31671e-6528-480a-ade5-2f20ca954bad-kube-api-access-jpf9r\") pod \"dnsmasq-dns-667d8947b7-xqcx6\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 
13:21:21.269031 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjs64\" (UniqueName: \"kubernetes.io/projected/3824617d-49df-4851-867f-284560eeaa2c-kube-api-access-kjs64\") pod \"keystone-bootstrap-2vv4k\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.299047 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.299330 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78044486-0042-4b06-ae63-0dbfe9b873ce-logs\") pod \"watcher-applier-0\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.299481 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-config-data\") pod \"watcher-applier-0\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.299565 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv522\" (UniqueName: \"kubernetes.io/projected/78044486-0042-4b06-ae63-0dbfe9b873ce-kube-api-access-vv522\") pod \"watcher-applier-0\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.299630 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-logs\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.299728 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.299831 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.299916 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.300009 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb5c8\" (UniqueName: \"kubernetes.io/projected/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-kube-api-access-hb5c8\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.306390 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-logs\") pod \"watcher-decision-engine-0\" 
(UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.306540 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.307083 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78044486-0042-4b06-ae63-0dbfe9b873ce-logs\") pod \"watcher-applier-0\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.318566 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.319384 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.319611 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.350332 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d55bc49dc-hzd62"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.352391 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.400256 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.403483 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.404053 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-config-data\") pod \"watcher-applier-0\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.405430 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb5c8\" (UniqueName: \"kubernetes.io/projected/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-kube-api-access-hb5c8\") pod \"watcher-decision-engine-0\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.408397 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-q6tgv" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.421827 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.425584 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv522\" (UniqueName: \"kubernetes.io/projected/78044486-0042-4b06-ae63-0dbfe9b873ce-kube-api-access-vv522\") pod \"watcher-applier-0\" (UID: 
\"78044486-0042-4b06-ae63-0dbfe9b873ce\") " pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.425982 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.434825 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.523655 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d55bc49dc-hzd62"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.527213 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-config-data\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.527314 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef1f9710-07da-4226-befb-73474a496cae-horizon-secret-key\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.527376 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-scripts\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.527711 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7hgh\" (UniqueName: 
\"kubernetes.io/projected/ef1f9710-07da-4226-befb-73474a496cae-kube-api-access-w7hgh\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.527814 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1f9710-07da-4226-befb-73474a496cae-logs\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.528746 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.532074 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-js7h4"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.534911 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.604065 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.604327 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-js7h4"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.611744 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-c74pn" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.611996 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.612827 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.620355 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.642337 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.645641 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.657044 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.662872 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7hgh\" (UniqueName: \"kubernetes.io/projected/ef1f9710-07da-4226-befb-73474a496cae-kube-api-access-w7hgh\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.662937 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1f9710-07da-4226-befb-73474a496cae-logs\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.663001 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kckg\" (UniqueName: \"kubernetes.io/projected/82b74021-1aea-4ef5-981a-2b0fc63ec06b-kube-api-access-2kckg\") pod \"neutron-db-sync-js7h4\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.663045 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-config-data\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.663070 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef1f9710-07da-4226-befb-73474a496cae-horizon-secret-key\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.663091 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-scripts\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.663118 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-config\") pod \"neutron-db-sync-js7h4\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.663163 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-combined-ca-bundle\") pod \"neutron-db-sync-js7h4\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.663788 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ef1f9710-07da-4226-befb-73474a496cae-logs\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.664931 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-config-data\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.667143 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-scripts\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.675420 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-d8hp7"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.676937 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.691637 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.701563 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.707922 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef1f9710-07da-4226-befb-73474a496cae-horizon-secret-key\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.715477 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-62mxc" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.715744 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.715888 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.731514 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.744766 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.764425 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-run-httpd\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.764499 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-config-data\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.764540 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-scripts\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.764566 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-config-data\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.764592 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57cch\" (UniqueName: 
\"kubernetes.io/projected/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-kube-api-access-57cch\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.764636 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kckg\" (UniqueName: \"kubernetes.io/projected/82b74021-1aea-4ef5-981a-2b0fc63ec06b-kube-api-access-2kckg\") pod \"neutron-db-sync-js7h4\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.764666 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-combined-ca-bundle\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.764685 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjn7w\" (UniqueName: \"kubernetes.io/projected/4f59ab79-d706-4e1f-9361-6efea6b85568-kube-api-access-mjn7w\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.764726 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.764743 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.770458 4867 generic.go:334] "Generic (PLEG): container finished" podID="2c9f6f61-8db8-4823-b09d-4f68bf749c3c" containerID="029d83d6726ea6ab389b72e58cd008dadbdb4d358dc4997aff618e8feadfaf35" exitCode=0 Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.770491 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" event={"ID":"2c9f6f61-8db8-4823-b09d-4f68bf749c3c","Type":"ContainerDied","Data":"029d83d6726ea6ab389b72e58cd008dadbdb4d358dc4997aff618e8feadfaf35"} Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.775624 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-config\") pod \"neutron-db-sync-js7h4\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.775675 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-scripts\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.775710 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-log-httpd\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.775868 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f59ab79-d706-4e1f-9361-6efea6b85568-logs\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.775915 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-combined-ca-bundle\") pod \"neutron-db-sync-js7h4\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.848338 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.867462 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.874001 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7hgh\" (UniqueName: \"kubernetes.io/projected/ef1f9710-07da-4226-befb-73474a496cae-kube-api-access-w7hgh\") pod \"horizon-7d55bc49dc-hzd62\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.876315 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-config\") pod \"neutron-db-sync-js7h4\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877512 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-run-httpd\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc 
kubenswrapper[4867]: I1006 13:21:21.877554 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c52253fc-5075-43f3-81e3-45cdbc49fa52-logs\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877581 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-config-data\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877605 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-config-data\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877622 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-scripts\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877653 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-config-data\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877675 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57cch\" (UniqueName: 
\"kubernetes.io/projected/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-kube-api-access-57cch\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877694 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877739 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prsnz\" (UniqueName: \"kubernetes.io/projected/c52253fc-5075-43f3-81e3-45cdbc49fa52-kube-api-access-prsnz\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877759 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-combined-ca-bundle\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877780 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjn7w\" (UniqueName: \"kubernetes.io/projected/4f59ab79-d706-4e1f-9361-6efea6b85568-kube-api-access-mjn7w\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877823 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877866 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877888 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-scripts\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877916 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-log-httpd\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877951 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.877976 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f59ab79-d706-4e1f-9361-6efea6b85568-logs\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc 
kubenswrapper[4867]: I1006 13:21:21.878438 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f59ab79-d706-4e1f-9361-6efea6b85568-logs\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.878885 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-run-httpd\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.879277 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-combined-ca-bundle\") pod \"neutron-db-sync-js7h4\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.888349 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-log-httpd\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.890543 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kckg\" (UniqueName: \"kubernetes.io/projected/82b74021-1aea-4ef5-981a-2b0fc63ec06b-kube-api-access-2kckg\") pod \"neutron-db-sync-js7h4\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.916462 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-d8hp7"] Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.919142 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-scripts\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.920243 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-config-data\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.921972 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-scripts\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.922039 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-combined-ca-bundle\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.922883 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.936127 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.957315 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-config-data\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.993514 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjn7w\" (UniqueName: \"kubernetes.io/projected/4f59ab79-d706-4e1f-9361-6efea6b85568-kube-api-access-mjn7w\") pod \"placement-db-sync-d8hp7\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:21 crc kubenswrapper[4867]: I1006 13:21:21.993605 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58966f7699-hjfh4"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.001009 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57cch\" (UniqueName: \"kubernetes.io/projected/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-kube-api-access-57cch\") pod \"ceilometer-0\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " pod="openstack/ceilometer-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.020565 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.026194 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c52253fc-5075-43f3-81e3-45cdbc49fa52-logs\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.026316 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-config-data\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.026405 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.026477 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prsnz\" (UniqueName: \"kubernetes.io/projected/c52253fc-5075-43f3-81e3-45cdbc49fa52-kube-api-access-prsnz\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.026645 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.035072 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-667d8947b7-xqcx6"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.049055 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c52253fc-5075-43f3-81e3-45cdbc49fa52-logs\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.049958 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.083619 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-config-data\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.083777 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.083941 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.120523 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58966f7699-hjfh4"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.123361 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prsnz\" (UniqueName: 
\"kubernetes.io/projected/c52253fc-5075-43f3-81e3-45cdbc49fa52-kube-api-access-prsnz\") pod \"watcher-api-0\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " pod="openstack/watcher-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.128663 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th9gw\" (UniqueName: \"kubernetes.io/projected/2a642d87-db23-4d12-90ac-ebdfbfe00996-kube-api-access-th9gw\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.128736 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-scripts\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.128788 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a642d87-db23-4d12-90ac-ebdfbfe00996-logs\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.128817 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-config-data\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.128836 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/2a642d87-db23-4d12-90ac-ebdfbfe00996-horizon-secret-key\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.162021 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.189299 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.220439 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d8hp7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.231523 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5df64d6755-5gzc7"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.234355 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.237233 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-scripts\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.237314 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a642d87-db23-4d12-90ac-ebdfbfe00996-logs\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.237351 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-config-data\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.237369 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a642d87-db23-4d12-90ac-ebdfbfe00996-horizon-secret-key\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.237446 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th9gw\" (UniqueName: \"kubernetes.io/projected/2a642d87-db23-4d12-90ac-ebdfbfe00996-kube-api-access-th9gw\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: 
I1006 13:21:22.241752 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-scripts\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.241773 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-config-data\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.245586 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a642d87-db23-4d12-90ac-ebdfbfe00996-logs\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.262838 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.269634 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th9gw\" (UniqueName: \"kubernetes.io/projected/2a642d87-db23-4d12-90ac-ebdfbfe00996-kube-api-access-th9gw\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.290059 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a642d87-db23-4d12-90ac-ebdfbfe00996-horizon-secret-key\") pod \"horizon-58966f7699-hjfh4\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.313200 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.344651 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-svc\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.344762 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-config\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.344853 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-swift-storage-0\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.344934 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-nb\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.345030 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm624\" (UniqueName: \"kubernetes.io/projected/60344bb3-1d96-49fc-bf7b-2fe9452160d3-kube-api-access-vm624\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.345060 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-sb\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.372455 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.376381 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.385764 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.385792 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.385991 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vgsz4" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.393306 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.406487 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.449711 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.449774 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-swift-storage-0\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.449798 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.449842 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.449871 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.449915 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-nb\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.449976 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.450002 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.450035 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-logs\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.450054 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm624\" (UniqueName: \"kubernetes.io/projected/60344bb3-1d96-49fc-bf7b-2fe9452160d3-kube-api-access-vm624\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.450081 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-sb\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.450123 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-svc\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.450149 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fwf2\" (UniqueName: 
\"kubernetes.io/projected/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-kube-api-access-4fwf2\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.450168 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-config\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.453311 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-nb\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.453921 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-swift-storage-0\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.454083 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-sb\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.454542 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-svc\") pod 
\"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.460510 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-config\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.471574 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5df64d6755-5gzc7"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.478022 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm624\" (UniqueName: \"kubernetes.io/projected/60344bb3-1d96-49fc-bf7b-2fe9452160d3-kube-api-access-vm624\") pod \"dnsmasq-dns-5df64d6755-5gzc7\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.504483 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.506988 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.510571 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.510834 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.524318 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.552993 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553069 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553098 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-scripts\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553122 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45xbn\" (UniqueName: 
\"kubernetes.io/projected/d854b81e-d1da-4eb4-9291-0a98cf04d652-kube-api-access-45xbn\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553162 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-logs\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553215 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fwf2\" (UniqueName: \"kubernetes.io/projected/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-kube-api-access-4fwf2\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553236 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553257 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-logs\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553348 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553368 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553388 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553417 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553433 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-config-data\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553449 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553477 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.553514 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.554917 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-logs\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.555689 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.556047 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.563324 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.571676 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-m8h8z"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.573389 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.578030 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r4jxx" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.580827 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.600571 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fwf2\" (UniqueName: \"kubernetes.io/projected/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-kube-api-access-4fwf2\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.623566 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.624577 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.636386 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.643496 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.674888 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.684720 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-logs\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.685036 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.685737 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.686845 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.686877 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-config-data\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.687000 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.687052 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-scripts\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.687074 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45xbn\" (UniqueName: \"kubernetes.io/projected/d854b81e-d1da-4eb4-9291-0a98cf04d652-kube-api-access-45xbn\") pod 
\"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.687201 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.689607 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.718850 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-logs\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.732672 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m8h8z"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.737294 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45xbn\" (UniqueName: \"kubernetes.io/projected/d854b81e-d1da-4eb4-9291-0a98cf04d652-kube-api-access-45xbn\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.741409 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.743738 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.748525 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.755737 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.820728 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-config-data\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.822517 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-scripts\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 
13:21:22.822540 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-4vzlg"] Oct 06 13:21:22 crc kubenswrapper[4867]: E1006 13:21:22.823059 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9f6f61-8db8-4823-b09d-4f68bf749c3c" containerName="dnsmasq-dns" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.823073 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9f6f61-8db8-4823-b09d-4f68bf749c3c" containerName="dnsmasq-dns" Oct 06 13:21:22 crc kubenswrapper[4867]: E1006 13:21:22.823104 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9f6f61-8db8-4823-b09d-4f68bf749c3c" containerName="init" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.823110 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9f6f61-8db8-4823-b09d-4f68bf749c3c" containerName="init" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.823334 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c9f6f61-8db8-4823-b09d-4f68bf749c3c" containerName="dnsmasq-dns" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.823985 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.825075 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc297\" (UniqueName: \"kubernetes.io/projected/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-kube-api-access-jc297\") pod \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.825210 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-nb\") pod \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.833100 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-s72xt" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.834093 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-swift-storage-0\") pod \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.834120 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-config\") pod \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.834189 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-sb\") pod \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\" (UID: 
\"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.834214 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-svc\") pod \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.838948 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69ef6013-a982-45c6-8fc8-46c11fead4a7-etc-machine-id\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.839025 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-db-sync-config-data\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.839077 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-combined-ca-bundle\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.839115 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-config-data\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: 
I1006 13:21:22.839348 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-scripts\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.839390 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s5k9\" (UniqueName: \"kubernetes.io/projected/69ef6013-a982-45c6-8fc8-46c11fead4a7-kube-api-access-2s5k9\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.842940 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.864753 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-kube-api-access-jc297" (OuterVolumeSpecName: "kube-api-access-jc297") pod "2c9f6f61-8db8-4823-b09d-4f68bf749c3c" (UID: "2c9f6f61-8db8-4823-b09d-4f68bf749c3c"). InnerVolumeSpecName "kube-api-access-jc297". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.944089 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4vzlg"] Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.946883 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69ef6013-a982-45c6-8fc8-46c11fead4a7-etc-machine-id\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.946924 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-db-sync-config-data\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.946950 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5q2m\" (UniqueName: \"kubernetes.io/projected/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-kube-api-access-p5q2m\") pod \"barbican-db-sync-4vzlg\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.946977 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-combined-ca-bundle\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.947001 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-config-data\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.947081 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-db-sync-config-data\") pod \"barbican-db-sync-4vzlg\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.947102 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-scripts\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.947127 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s5k9\" (UniqueName: \"kubernetes.io/projected/69ef6013-a982-45c6-8fc8-46c11fead4a7-kube-api-access-2s5k9\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.947164 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-combined-ca-bundle\") pod \"barbican-db-sync-4vzlg\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.947232 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc297\" (UniqueName: \"kubernetes.io/projected/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-kube-api-access-jc297\") on node 
\"crc\" DevicePath \"\"" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.947348 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69ef6013-a982-45c6-8fc8-46c11fead4a7-etc-machine-id\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.990200 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-scripts\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.990967 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" event={"ID":"2c9f6f61-8db8-4823-b09d-4f68bf749c3c","Type":"ContainerDied","Data":"89ee8d309a0c2105f562feeca82de155805328193f281553a687c7f97b809c0e"} Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.991023 4867 scope.go:117] "RemoveContainer" containerID="029d83d6726ea6ab389b72e58cd008dadbdb4d358dc4997aff618e8feadfaf35" Oct 06 13:21:22 crc kubenswrapper[4867]: I1006 13:21:22.991165 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-868dfdb867-qjx9n" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.002032 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-config-data\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.004617 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-combined-ca-bundle\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.006356 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-db-sync-config-data\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.043690 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.048728 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-db-sync-config-data\") pod \"barbican-db-sync-4vzlg\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.048841 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-combined-ca-bundle\") pod \"barbican-db-sync-4vzlg\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.048910 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5q2m\" (UniqueName: \"kubernetes.io/projected/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-kube-api-access-p5q2m\") pod \"barbican-db-sync-4vzlg\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.095572 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s5k9\" (UniqueName: \"kubernetes.io/projected/69ef6013-a982-45c6-8fc8-46c11fead4a7-kube-api-access-2s5k9\") pod \"cinder-db-sync-m8h8z\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.098434 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-combined-ca-bundle\") pod \"barbican-db-sync-4vzlg\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:21:23 
crc kubenswrapper[4867]: I1006 13:21:23.099190 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-db-sync-config-data\") pod \"barbican-db-sync-4vzlg\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.117228 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2vv4k"] Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.165368 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " pod="openstack/glance-default-external-api-0" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.175412 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5q2m\" (UniqueName: \"kubernetes.io/projected/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-kube-api-access-p5q2m\") pod \"barbican-db-sync-4vzlg\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.203092 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.207418 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.274244 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c9f6f61-8db8-4823-b09d-4f68bf749c3c" (UID: "2c9f6f61-8db8-4823-b09d-4f68bf749c3c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.274682 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-svc\") pod \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\" (UID: \"2c9f6f61-8db8-4823-b09d-4f68bf749c3c\") " Oct 06 13:21:23 crc kubenswrapper[4867]: W1006 13:21:23.275398 4867 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2c9f6f61-8db8-4823-b09d-4f68bf749c3c/volumes/kubernetes.io~configmap/dns-svc Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.275433 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c9f6f61-8db8-4823-b09d-4f68bf749c3c" (UID: "2c9f6f61-8db8-4823-b09d-4f68bf749c3c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.280946 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c9f6f61-8db8-4823-b09d-4f68bf749c3c" (UID: "2c9f6f61-8db8-4823-b09d-4f68bf749c3c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.301188 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.363784 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2c9f6f61-8db8-4823-b09d-4f68bf749c3c" (UID: "2c9f6f61-8db8-4823-b09d-4f68bf749c3c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.372379 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-config" (OuterVolumeSpecName: "config") pod "2c9f6f61-8db8-4823-b09d-4f68bf749c3c" (UID: "2c9f6f61-8db8-4823-b09d-4f68bf749c3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.372919 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c9f6f61-8db8-4823-b09d-4f68bf749c3c" (UID: "2c9f6f61-8db8-4823-b09d-4f68bf749c3c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.376518 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-667d8947b7-xqcx6"] Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.376570 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.379046 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.379091 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.379101 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.379111 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.379119 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c9f6f61-8db8-4823-b09d-4f68bf749c3c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.406729 4867 scope.go:117] "RemoveContainer" containerID="c5198fc6c00032a552a3bc21cdd2dd945f9d68448a893cae3b6ab47cf6203c2d" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.465899 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.625965 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d55bc49dc-hzd62"] Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.660963 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-js7h4"] Oct 06 13:21:23 crc kubenswrapper[4867]: W1006 13:21:23.691838 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef1f9710_07da_4226_befb_73474a496cae.slice/crio-a3dff45e328e489cd5866aa74480bf46905b0c7ee79f693480e6ca409986c667 WatchSource:0}: Error finding container a3dff45e328e489cd5866aa74480bf46905b0c7ee79f693480e6ca409986c667: Status 404 returned error can't find the container with id a3dff45e328e489cd5866aa74480bf46905b0c7ee79f693480e6ca409986c667 Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.773929 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-868dfdb867-qjx9n"] Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.787948 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-868dfdb867-qjx9n"] Oct 06 13:21:23 crc kubenswrapper[4867]: I1006 13:21:23.839163 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-d8hp7"] Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.016156 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-js7h4" event={"ID":"82b74021-1aea-4ef5-981a-2b0fc63ec06b","Type":"ContainerStarted","Data":"b2c3fe939ce96160b6457fec8d475c274f8e14ea45e211d375cdd1d71d53fb66"} Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.021804 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d8hp7" 
event={"ID":"4f59ab79-d706-4e1f-9361-6efea6b85568","Type":"ContainerStarted","Data":"8f30944820092d0b9d03142e9535f82eb095ea1e0030bfd3c486f5a60b9a66d3"} Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.028566 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d55bc49dc-hzd62" event={"ID":"ef1f9710-07da-4226-befb-73474a496cae","Type":"ContainerStarted","Data":"a3dff45e328e489cd5866aa74480bf46905b0c7ee79f693480e6ca409986c667"} Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.031777 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3","Type":"ContainerStarted","Data":"b1a78f23360ebcd4d067b2c22ea82d3d1b72a97a2a34eb3d8776e13c069d394e"} Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.034547 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"78044486-0042-4b06-ae63-0dbfe9b873ce","Type":"ContainerStarted","Data":"7186781a1b18e864146c9bae88f3994b370341a57ff92fdf9d2a742d3ef850b3"} Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.041104 4867 generic.go:334] "Generic (PLEG): container finished" podID="7e31671e-6528-480a-ade5-2f20ca954bad" containerID="02119139ab396dd771b4ebebd00e1b12b0b0fc639224890e54d6c5945f617c5b" exitCode=0 Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.041178 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" event={"ID":"7e31671e-6528-480a-ade5-2f20ca954bad","Type":"ContainerDied","Data":"02119139ab396dd771b4ebebd00e1b12b0b0fc639224890e54d6c5945f617c5b"} Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.041202 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" event={"ID":"7e31671e-6528-480a-ade5-2f20ca954bad","Type":"ContainerStarted","Data":"36c129933bbddf95aa72950309fd9df909202beb3158f818477e394087d3d22c"} Oct 06 13:21:24 crc 
kubenswrapper[4867]: I1006 13:21:24.121194 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2vv4k" event={"ID":"3824617d-49df-4851-867f-284560eeaa2c","Type":"ContainerStarted","Data":"232360b71bbd8600afe387f269e707b6841ddf8870e8cdea62fed2b11ab1c732"} Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.121275 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2vv4k" event={"ID":"3824617d-49df-4851-867f-284560eeaa2c","Type":"ContainerStarted","Data":"08126750aff912ab741328196c7acd2da237ef4ef917d5aed419fe6156060c91"} Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.152341 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2vv4k" podStartSLOduration=4.152311739 podStartE2EDuration="4.152311739s" podCreationTimestamp="2025-10-06 13:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:21:24.148612228 +0000 UTC m=+1063.606560372" watchObservedRunningTime="2025-10-06 13:21:24.152311739 +0000 UTC m=+1063.610259883" Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.258015 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.274327 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58966f7699-hjfh4"] Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.297315 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.703335 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.720988 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 
13:21:24.747651 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58966f7699-hjfh4"] Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.820356 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7544c988fc-272d4"] Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.822240 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.844092 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7544c988fc-272d4"] Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.905497 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.955040 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-config-data\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.955121 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-scripts\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.955163 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8819e30-41b5-4fcd-8158-b6b5c178aea9-logs\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.955193 
4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a8819e30-41b5-4fcd-8158-b6b5c178aea9-horizon-secret-key\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.955224 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkxpq\" (UniqueName: \"kubernetes.io/projected/a8819e30-41b5-4fcd-8158-b6b5c178aea9-kube-api-access-rkxpq\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:24 crc kubenswrapper[4867]: I1006 13:21:24.981161 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.030418 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4vzlg"] Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.063086 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-scripts\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.069301 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8819e30-41b5-4fcd-8158-b6b5c178aea9-logs\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.069085 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-scripts\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.069431 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a8819e30-41b5-4fcd-8158-b6b5c178aea9-horizon-secret-key\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.069666 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkxpq\" (UniqueName: \"kubernetes.io/projected/a8819e30-41b5-4fcd-8158-b6b5c178aea9-kube-api-access-rkxpq\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.069931 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-config-data\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.071094 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8819e30-41b5-4fcd-8158-b6b5c178aea9-logs\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.071309 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-config-data\") pod \"horizon-7544c988fc-272d4\" (UID: 
\"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:25 crc kubenswrapper[4867]: W1006 13:21:25.071362 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60344bb3_1d96_49fc_bf7b_2fe9452160d3.slice/crio-d6aafe20a37b6b66ce19777b5dde436fc1df9a858668e8c74c96e310b24a8a55 WatchSource:0}: Error finding container d6aafe20a37b6b66ce19777b5dde436fc1df9a858668e8c74c96e310b24a8a55: Status 404 returned error can't find the container with id d6aafe20a37b6b66ce19777b5dde436fc1df9a858668e8c74c96e310b24a8a55 Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.079334 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a8819e30-41b5-4fcd-8158-b6b5c178aea9-horizon-secret-key\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.115371 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5df64d6755-5gzc7"] Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.119067 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkxpq\" (UniqueName: \"kubernetes.io/projected/a8819e30-41b5-4fcd-8158-b6b5c178aea9-kube-api-access-rkxpq\") pod \"horizon-7544c988fc-272d4\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.192615 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.203773 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.212738 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m8h8z"] Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.324687 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c9f6f61-8db8-4823-b09d-4f68bf749c3c" path="/var/lib/kubelet/pods/2c9f6f61-8db8-4823-b09d-4f68bf749c3c/volumes" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.325710 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" event={"ID":"60344bb3-1d96-49fc-bf7b-2fe9452160d3","Type":"ContainerStarted","Data":"d6aafe20a37b6b66ce19777b5dde436fc1df9a858668e8c74c96e310b24a8a55"} Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.325741 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01","Type":"ContainerStarted","Data":"a67fffe53a2617487eb3ce45a041622c565865ef52426b3440a27e771957b294"} Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.325752 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4vzlg" event={"ID":"1ab90007-6383-4ff2-97cc-edb5d7d13d1e","Type":"ContainerStarted","Data":"2e4bd65fe944303c7e779fd33caed054fbcb28d9c547fbd0835990d3419da410"} Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.325763 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58966f7699-hjfh4" event={"ID":"2a642d87-db23-4d12-90ac-ebdfbfe00996","Type":"ContainerStarted","Data":"6aa36ef7116bc1e26c7658ff412270e82977faa327dea3d4d5fa3d36455bb37a"} Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.325773 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/watcher-api-0" event={"ID":"c52253fc-5075-43f3-81e3-45cdbc49fa52","Type":"ContainerStarted","Data":"d6e5bc238916ee18aa99888c7b12bb77fc2dcf388efce2a05c818f4457bfaf40"} Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.325783 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c52253fc-5075-43f3-81e3-45cdbc49fa52","Type":"ContainerStarted","Data":"d93908f0f7882f100f121ede62c286ea1c2a7aa0cd146e3a31786361a696fc18"} Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.325795 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-js7h4" event={"ID":"82b74021-1aea-4ef5-981a-2b0fc63ec06b","Type":"ContainerStarted","Data":"adf04b731cde07ae684ec0a4ffb7eed577de8c4d9b3e127e57dabb10f30c79d5"} Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.343157 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-js7h4" podStartSLOduration=4.34313794 podStartE2EDuration="4.34313794s" podCreationTimestamp="2025-10-06 13:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:21:25.338653748 +0000 UTC m=+1064.796601912" watchObservedRunningTime="2025-10-06 13:21:25.34313794 +0000 UTC m=+1064.801086084" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.362604 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.386317 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.482850 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-svc\") pod \"7e31671e-6528-480a-ade5-2f20ca954bad\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.482996 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-nb\") pod \"7e31671e-6528-480a-ade5-2f20ca954bad\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.483052 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpf9r\" (UniqueName: \"kubernetes.io/projected/7e31671e-6528-480a-ade5-2f20ca954bad-kube-api-access-jpf9r\") pod \"7e31671e-6528-480a-ade5-2f20ca954bad\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.484180 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-config\") pod \"7e31671e-6528-480a-ade5-2f20ca954bad\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.484227 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-swift-storage-0\") pod \"7e31671e-6528-480a-ade5-2f20ca954bad\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " 
Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.484306 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-sb\") pod \"7e31671e-6528-480a-ade5-2f20ca954bad\" (UID: \"7e31671e-6528-480a-ade5-2f20ca954bad\") " Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.515230 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e31671e-6528-480a-ade5-2f20ca954bad-kube-api-access-jpf9r" (OuterVolumeSpecName: "kube-api-access-jpf9r") pod "7e31671e-6528-480a-ade5-2f20ca954bad" (UID: "7e31671e-6528-480a-ade5-2f20ca954bad"). InnerVolumeSpecName "kube-api-access-jpf9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.536140 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e31671e-6528-480a-ade5-2f20ca954bad" (UID: "7e31671e-6528-480a-ade5-2f20ca954bad"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.542993 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e31671e-6528-480a-ade5-2f20ca954bad" (UID: "7e31671e-6528-480a-ade5-2f20ca954bad"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.547838 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-config" (OuterVolumeSpecName: "config") pod "7e31671e-6528-480a-ade5-2f20ca954bad" (UID: "7e31671e-6528-480a-ade5-2f20ca954bad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.547980 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e31671e-6528-480a-ade5-2f20ca954bad" (UID: "7e31671e-6528-480a-ade5-2f20ca954bad"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.559784 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e31671e-6528-480a-ade5-2f20ca954bad" (UID: "7e31671e-6528-480a-ade5-2f20ca954bad"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.596053 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.596092 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpf9r\" (UniqueName: \"kubernetes.io/projected/7e31671e-6528-480a-ade5-2f20ca954bad-kube-api-access-jpf9r\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.596105 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.596115 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.596126 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:25 crc kubenswrapper[4867]: I1006 13:21:25.596135 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e31671e-6528-480a-ade5-2f20ca954bad-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.015457 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7544c988fc-272d4"] Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.360072 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"d854b81e-d1da-4eb4-9291-0a98cf04d652","Type":"ContainerStarted","Data":"ac66e97d55a899d9e3657c35fb81f0c9746684722765ce59553220a3abf44b51"} Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.364911 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c52253fc-5075-43f3-81e3-45cdbc49fa52","Type":"ContainerStarted","Data":"d107a15fd03c53fd4f1ecab07a06e4c731affb3d8e38d88a40b5e1a438e91145"} Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.365137 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api-log" containerID="cri-o://d6e5bc238916ee18aa99888c7b12bb77fc2dcf388efce2a05c818f4457bfaf40" gracePeriod=30 Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.365144 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api" containerID="cri-o://d107a15fd03c53fd4f1ecab07a06e4c731affb3d8e38d88a40b5e1a438e91145" gracePeriod=30 Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.366627 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.383396 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": EOF" Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.389780 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m8h8z" event={"ID":"69ef6013-a982-45c6-8fc8-46c11fead4a7","Type":"ContainerStarted","Data":"bc723ac44659405b7e03c95fb1ff5694a7efb5e8a52d3ea3db75d587f0edf58a"} Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.402351 4867 generic.go:334] "Generic 
(PLEG): container finished" podID="60344bb3-1d96-49fc-bf7b-2fe9452160d3" containerID="9dc386b8a5ff32df5970821557cf49b137d0ceb352127e977a5d961a74f562c2" exitCode=0 Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.402744 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" event={"ID":"60344bb3-1d96-49fc-bf7b-2fe9452160d3","Type":"ContainerDied","Data":"9dc386b8a5ff32df5970821557cf49b137d0ceb352127e977a5d961a74f562c2"} Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.402430 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=5.402403524 podStartE2EDuration="5.402403524s" podCreationTimestamp="2025-10-06 13:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:21:26.39273661 +0000 UTC m=+1065.850684764" watchObservedRunningTime="2025-10-06 13:21:26.402403524 +0000 UTC m=+1065.860351668" Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.409719 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c6e4e3d-0f22-4a7d-9396-0ac732d65496","Type":"ContainerStarted","Data":"9a3ca6b45fd5802cd49c8d7039dc703071417a44f664a074dad0fadfefce27d1"} Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.422544 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.422640 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667d8947b7-xqcx6" event={"ID":"7e31671e-6528-480a-ade5-2f20ca954bad","Type":"ContainerDied","Data":"36c129933bbddf95aa72950309fd9df909202beb3158f818477e394087d3d22c"} Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.422940 4867 scope.go:117] "RemoveContainer" containerID="02119139ab396dd771b4ebebd00e1b12b0b0fc639224890e54d6c5945f617c5b" Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.530165 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-667d8947b7-xqcx6"] Oct 06 13:21:26 crc kubenswrapper[4867]: I1006 13:21:26.558566 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-667d8947b7-xqcx6"] Oct 06 13:21:27 crc kubenswrapper[4867]: I1006 13:21:27.253895 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e31671e-6528-480a-ade5-2f20ca954bad" path="/var/lib/kubelet/pods/7e31671e-6528-480a-ade5-2f20ca954bad/volumes" Oct 06 13:21:27 crc kubenswrapper[4867]: I1006 13:21:27.264447 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 13:21:27 crc kubenswrapper[4867]: I1006 13:21:27.435691 4867 generic.go:334] "Generic (PLEG): container finished" podID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerID="d6e5bc238916ee18aa99888c7b12bb77fc2dcf388efce2a05c818f4457bfaf40" exitCode=143 Oct 06 13:21:27 crc kubenswrapper[4867]: I1006 13:21:27.435750 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c52253fc-5075-43f3-81e3-45cdbc49fa52","Type":"ContainerDied","Data":"d6e5bc238916ee18aa99888c7b12bb77fc2dcf388efce2a05c818f4457bfaf40"} Oct 06 13:21:27 crc kubenswrapper[4867]: I1006 13:21:27.437786 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"3c6e4e3d-0f22-4a7d-9396-0ac732d65496","Type":"ContainerStarted","Data":"ec52b5e3239eb2de61e0a540744cd800a134afc14fcf17c1339d502e8a860797"} Oct 06 13:21:27 crc kubenswrapper[4867]: I1006 13:21:27.441322 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d854b81e-d1da-4eb4-9291-0a98cf04d652","Type":"ContainerStarted","Data":"e971769bba688411b067a0f78eb44b92a4867828187d9ff697ebe1ff5d98a973"} Oct 06 13:21:27 crc kubenswrapper[4867]: W1006 13:21:27.573539 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8819e30_41b5_4fcd_8158_b6b5c178aea9.slice/crio-002c355b508e1ddbe3658075fd90c0d786e7368db465321304d15db9b491dc80 WatchSource:0}: Error finding container 002c355b508e1ddbe3658075fd90c0d786e7368db465321304d15db9b491dc80: Status 404 returned error can't find the container with id 002c355b508e1ddbe3658075fd90c0d786e7368db465321304d15db9b491dc80 Oct 06 13:21:28 crc kubenswrapper[4867]: I1006 13:21:28.456531 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7544c988fc-272d4" event={"ID":"a8819e30-41b5-4fcd-8158-b6b5c178aea9","Type":"ContainerStarted","Data":"002c355b508e1ddbe3658075fd90c0d786e7368db465321304d15db9b491dc80"} Oct 06 13:21:30 crc kubenswrapper[4867]: I1006 13:21:30.496481 4867 generic.go:334] "Generic (PLEG): container finished" podID="3824617d-49df-4851-867f-284560eeaa2c" containerID="232360b71bbd8600afe387f269e707b6841ddf8870e8cdea62fed2b11ab1c732" exitCode=0 Oct 06 13:21:30 crc kubenswrapper[4867]: I1006 13:21:30.496942 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2vv4k" event={"ID":"3824617d-49df-4851-867f-284560eeaa2c","Type":"ContainerDied","Data":"232360b71bbd8600afe387f269e707b6841ddf8870e8cdea62fed2b11ab1c732"} Oct 06 13:21:30 crc kubenswrapper[4867]: I1006 13:21:30.499125 4867 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": read tcp 10.217.0.2:35170->10.217.0.155:9322: read: connection reset by peer" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.525549 4867 generic.go:334] "Generic (PLEG): container finished" podID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerID="d107a15fd03c53fd4f1ecab07a06e4c731affb3d8e38d88a40b5e1a438e91145" exitCode=0 Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.525982 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c52253fc-5075-43f3-81e3-45cdbc49fa52","Type":"ContainerDied","Data":"d107a15fd03c53fd4f1ecab07a06e4c731affb3d8e38d88a40b5e1a438e91145"} Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.737308 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d55bc49dc-hzd62"] Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.774737 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-577bfb968d-pw7pq"] Oct 06 13:21:31 crc kubenswrapper[4867]: E1006 13:21:31.775219 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e31671e-6528-480a-ade5-2f20ca954bad" containerName="init" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.775238 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e31671e-6528-480a-ade5-2f20ca954bad" containerName="init" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.775484 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e31671e-6528-480a-ade5-2f20ca954bad" containerName="init" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.776855 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.782763 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.796433 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-scripts\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.796490 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-config-data\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.796539 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77541c32-3bc1-402d-aa9f-924f9b6cb37f-logs\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.796739 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-combined-ca-bundle\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.802551 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-secret-key\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.802614 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-tls-certs\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.802653 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvtnv\" (UniqueName: \"kubernetes.io/projected/77541c32-3bc1-402d-aa9f-924f9b6cb37f-kube-api-access-fvtnv\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.806361 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-577bfb968d-pw7pq"] Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.863406 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7544c988fc-272d4"] Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.881120 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-69d5cf7ffb-c2rgt"] Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.882916 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.891837 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69d5cf7ffb-c2rgt"] Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908397 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7e92d5c-74ed-47bc-995a-d3712014f109-config-data\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908469 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7e92d5c-74ed-47bc-995a-d3712014f109-horizon-tls-certs\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908528 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-combined-ca-bundle\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908545 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-secret-key\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908566 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/d7e92d5c-74ed-47bc-995a-d3712014f109-horizon-secret-key\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908584 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-tls-certs\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908606 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvtnv\" (UniqueName: \"kubernetes.io/projected/77541c32-3bc1-402d-aa9f-924f9b6cb37f-kube-api-access-fvtnv\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908630 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7e92d5c-74ed-47bc-995a-d3712014f109-combined-ca-bundle\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908675 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqxz4\" (UniqueName: \"kubernetes.io/projected/d7e92d5c-74ed-47bc-995a-d3712014f109-kube-api-access-wqxz4\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908699 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-scripts\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908716 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7e92d5c-74ed-47bc-995a-d3712014f109-scripts\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908744 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-config-data\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908781 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77541c32-3bc1-402d-aa9f-924f9b6cb37f-logs\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.908824 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7e92d5c-74ed-47bc-995a-d3712014f109-logs\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.911866 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77541c32-3bc1-402d-aa9f-924f9b6cb37f-logs\") pod \"horizon-577bfb968d-pw7pq\" (UID: 
\"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.916435 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-scripts\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.925835 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-secret-key\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.926046 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-combined-ca-bundle\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.926199 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-config-data\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 13:21:31.926236 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-tls-certs\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:31 crc kubenswrapper[4867]: I1006 
13:21:31.932462 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvtnv\" (UniqueName: \"kubernetes.io/projected/77541c32-3bc1-402d-aa9f-924f9b6cb37f-kube-api-access-fvtnv\") pod \"horizon-577bfb968d-pw7pq\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") " pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.011782 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7e92d5c-74ed-47bc-995a-d3712014f109-config-data\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.011911 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7e92d5c-74ed-47bc-995a-d3712014f109-horizon-tls-certs\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.012068 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7e92d5c-74ed-47bc-995a-d3712014f109-horizon-secret-key\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.012109 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7e92d5c-74ed-47bc-995a-d3712014f109-combined-ca-bundle\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.012196 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wqxz4\" (UniqueName: \"kubernetes.io/projected/d7e92d5c-74ed-47bc-995a-d3712014f109-kube-api-access-wqxz4\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.012281 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7e92d5c-74ed-47bc-995a-d3712014f109-scripts\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.013430 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7e92d5c-74ed-47bc-995a-d3712014f109-scripts\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.013460 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7e92d5c-74ed-47bc-995a-d3712014f109-logs\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.013743 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7e92d5c-74ed-47bc-995a-d3712014f109-logs\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.014455 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7e92d5c-74ed-47bc-995a-d3712014f109-config-data\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: 
\"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.017750 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d7e92d5c-74ed-47bc-995a-d3712014f109-horizon-secret-key\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.017796 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7e92d5c-74ed-47bc-995a-d3712014f109-horizon-tls-certs\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.030673 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqxz4\" (UniqueName: \"kubernetes.io/projected/d7e92d5c-74ed-47bc-995a-d3712014f109-kube-api-access-wqxz4\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.043037 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7e92d5c-74ed-47bc-995a-d3712014f109-combined-ca-bundle\") pod \"horizon-69d5cf7ffb-c2rgt\" (UID: \"d7e92d5c-74ed-47bc-995a-d3712014f109\") " pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.117052 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.264675 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": dial tcp 10.217.0.155:9322: connect: connection refused" Oct 06 13:21:32 crc kubenswrapper[4867]: I1006 13:21:32.325066 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:21:37 crc kubenswrapper[4867]: I1006 13:21:37.265406 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": dial tcp 10.217.0.155:9322: connect: connection refused" Oct 06 13:21:38 crc kubenswrapper[4867]: E1006 13:21:38.939057 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.151:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Oct 06 13:21:38 crc kubenswrapper[4867]: E1006 13:21:38.939605 4867 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.151:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Oct 06 13:21:38 crc kubenswrapper[4867]: E1006 13:21:38.939779 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:38.102.83.151:5001/podified-master-centos10/openstack-placement-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mjn7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-d8hp7_openstack(4f59ab79-d706-4e1f-9361-6efea6b85568): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 13:21:38 crc kubenswrapper[4867]: E1006 13:21:38.940952 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-d8hp7" podUID="4f59ab79-d706-4e1f-9361-6efea6b85568" Oct 06 13:21:39 crc kubenswrapper[4867]: E1006 13:21:39.627944 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.151:5001/podified-master-centos10/openstack-placement-api:watcher_latest\\\"\"" pod="openstack/placement-db-sync-d8hp7" podUID="4f59ab79-d706-4e1f-9361-6efea6b85568" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.551864 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.660526 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2vv4k" event={"ID":"3824617d-49df-4851-867f-284560eeaa2c","Type":"ContainerDied","Data":"08126750aff912ab741328196c7acd2da237ef4ef917d5aed419fe6156060c91"} Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.661040 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08126750aff912ab741328196c7acd2da237ef4ef917d5aed419fe6156060c91" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.661113 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2vv4k" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.680500 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-credential-keys\") pod \"3824617d-49df-4851-867f-284560eeaa2c\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.680668 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjs64\" (UniqueName: \"kubernetes.io/projected/3824617d-49df-4851-867f-284560eeaa2c-kube-api-access-kjs64\") pod \"3824617d-49df-4851-867f-284560eeaa2c\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.680690 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-config-data\") pod \"3824617d-49df-4851-867f-284560eeaa2c\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.680740 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-scripts\") pod \"3824617d-49df-4851-867f-284560eeaa2c\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.680819 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-fernet-keys\") pod \"3824617d-49df-4851-867f-284560eeaa2c\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.680914 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-combined-ca-bundle\") pod \"3824617d-49df-4851-867f-284560eeaa2c\" (UID: \"3824617d-49df-4851-867f-284560eeaa2c\") " Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.688692 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-scripts" (OuterVolumeSpecName: "scripts") pod "3824617d-49df-4851-867f-284560eeaa2c" (UID: "3824617d-49df-4851-867f-284560eeaa2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.689545 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3824617d-49df-4851-867f-284560eeaa2c" (UID: "3824617d-49df-4851-867f-284560eeaa2c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.692835 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3824617d-49df-4851-867f-284560eeaa2c" (UID: "3824617d-49df-4851-867f-284560eeaa2c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.707115 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3824617d-49df-4851-867f-284560eeaa2c-kube-api-access-kjs64" (OuterVolumeSpecName: "kube-api-access-kjs64") pod "3824617d-49df-4851-867f-284560eeaa2c" (UID: "3824617d-49df-4851-867f-284560eeaa2c"). InnerVolumeSpecName "kube-api-access-kjs64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.711582 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3824617d-49df-4851-867f-284560eeaa2c" (UID: "3824617d-49df-4851-867f-284560eeaa2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.723957 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-config-data" (OuterVolumeSpecName: "config-data") pod "3824617d-49df-4851-867f-284560eeaa2c" (UID: "3824617d-49df-4851-867f-284560eeaa2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.783929 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.783977 4867 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.783987 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjs64\" (UniqueName: \"kubernetes.io/projected/3824617d-49df-4851-867f-284560eeaa2c-kube-api-access-kjs64\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.783999 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-config-data\") on node \"crc\" 
DevicePath \"\"" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.784008 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:42 crc kubenswrapper[4867]: I1006 13:21:42.784018 4867 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3824617d-49df-4851-867f-284560eeaa2c-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.653152 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2vv4k"] Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.659876 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2vv4k"] Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.738338 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-plcg5"] Oct 06 13:21:43 crc kubenswrapper[4867]: E1006 13:21:43.738717 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3824617d-49df-4851-867f-284560eeaa2c" containerName="keystone-bootstrap" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.738739 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3824617d-49df-4851-867f-284560eeaa2c" containerName="keystone-bootstrap" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.738964 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3824617d-49df-4851-867f-284560eeaa2c" containerName="keystone-bootstrap" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.739629 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.746311 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.746613 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.746767 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sjvr2" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.747032 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.750972 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-plcg5"] Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.813619 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7wsb\" (UniqueName: \"kubernetes.io/projected/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-kube-api-access-b7wsb\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.813679 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-combined-ca-bundle\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.813739 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-config-data\") pod 
\"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.813763 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-fernet-keys\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.813792 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-scripts\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.813830 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-credential-keys\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.915696 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7wsb\" (UniqueName: \"kubernetes.io/projected/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-kube-api-access-b7wsb\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.915747 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-combined-ca-bundle\") pod \"keystone-bootstrap-plcg5\" (UID: 
\"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.915784 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-config-data\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.915811 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-fernet-keys\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.915830 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-scripts\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.915865 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-credential-keys\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.919929 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-credential-keys\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 
13:21:43.920286 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-config-data\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.920286 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-combined-ca-bundle\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.923371 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-scripts\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.929836 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-fernet-keys\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:43 crc kubenswrapper[4867]: I1006 13:21:43.938631 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7wsb\" (UniqueName: \"kubernetes.io/projected/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-kube-api-access-b7wsb\") pod \"keystone-bootstrap-plcg5\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:44 crc kubenswrapper[4867]: I1006 13:21:44.066343 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:21:45 crc kubenswrapper[4867]: I1006 13:21:45.235896 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3824617d-49df-4851-867f-284560eeaa2c" path="/var/lib/kubelet/pods/3824617d-49df-4851-867f-284560eeaa2c/volumes" Oct 06 13:21:46 crc kubenswrapper[4867]: I1006 13:21:46.698969 4867 generic.go:334] "Generic (PLEG): container finished" podID="82b74021-1aea-4ef5-981a-2b0fc63ec06b" containerID="adf04b731cde07ae684ec0a4ffb7eed577de8c4d9b3e127e57dabb10f30c79d5" exitCode=0 Oct 06 13:21:46 crc kubenswrapper[4867]: I1006 13:21:46.699057 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-js7h4" event={"ID":"82b74021-1aea-4ef5-981a-2b0fc63ec06b","Type":"ContainerDied","Data":"adf04b731cde07ae684ec0a4ffb7eed577de8c4d9b3e127e57dabb10f30c79d5"} Oct 06 13:21:47 crc kubenswrapper[4867]: I1006 13:21:47.264794 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": dial tcp 10.217.0.155:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)" Oct 06 13:21:52 crc kubenswrapper[4867]: I1006 13:21:52.265643 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.184879 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.225467 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-config-data\") pod \"c52253fc-5075-43f3-81e3-45cdbc49fa52\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.225635 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-combined-ca-bundle\") pod \"c52253fc-5075-43f3-81e3-45cdbc49fa52\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.225721 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c52253fc-5075-43f3-81e3-45cdbc49fa52-logs\") pod \"c52253fc-5075-43f3-81e3-45cdbc49fa52\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.226366 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c52253fc-5075-43f3-81e3-45cdbc49fa52-logs" (OuterVolumeSpecName: "logs") pod "c52253fc-5075-43f3-81e3-45cdbc49fa52" (UID: "c52253fc-5075-43f3-81e3-45cdbc49fa52"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.226453 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-custom-prometheus-ca\") pod \"c52253fc-5075-43f3-81e3-45cdbc49fa52\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.226662 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prsnz\" (UniqueName: \"kubernetes.io/projected/c52253fc-5075-43f3-81e3-45cdbc49fa52-kube-api-access-prsnz\") pod \"c52253fc-5075-43f3-81e3-45cdbc49fa52\" (UID: \"c52253fc-5075-43f3-81e3-45cdbc49fa52\") " Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.227839 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c52253fc-5075-43f3-81e3-45cdbc49fa52-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.233382 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c52253fc-5075-43f3-81e3-45cdbc49fa52-kube-api-access-prsnz" (OuterVolumeSpecName: "kube-api-access-prsnz") pod "c52253fc-5075-43f3-81e3-45cdbc49fa52" (UID: "c52253fc-5075-43f3-81e3-45cdbc49fa52"). InnerVolumeSpecName "kube-api-access-prsnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.257056 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c52253fc-5075-43f3-81e3-45cdbc49fa52" (UID: "c52253fc-5075-43f3-81e3-45cdbc49fa52"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.277210 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "c52253fc-5075-43f3-81e3-45cdbc49fa52" (UID: "c52253fc-5075-43f3-81e3-45cdbc49fa52"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.298031 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-config-data" (OuterVolumeSpecName: "config-data") pod "c52253fc-5075-43f3-81e3-45cdbc49fa52" (UID: "c52253fc-5075-43f3-81e3-45cdbc49fa52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.329694 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.329735 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.329746 4867 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c52253fc-5075-43f3-81e3-45cdbc49fa52-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.329756 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prsnz\" (UniqueName: \"kubernetes.io/projected/c52253fc-5075-43f3-81e3-45cdbc49fa52-kube-api-access-prsnz\") on node 
\"crc\" DevicePath \"\"" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.750572 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:56 crc kubenswrapper[4867]: E1006 13:21:56.760514 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.151:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Oct 06 13:21:56 crc kubenswrapper[4867]: E1006 13:21:56.760722 4867 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.151:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Oct 06 13:21:56 crc kubenswrapper[4867]: E1006 13:21:56.761327 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.102.83.151:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p5q2m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-4vzlg_openstack(1ab90007-6383-4ff2-97cc-edb5d7d13d1e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 13:21:56 crc kubenswrapper[4867]: E1006 13:21:56.764273 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-4vzlg" 
podUID="1ab90007-6383-4ff2-97cc-edb5d7d13d1e" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.835179 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.835178 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c52253fc-5075-43f3-81e3-45cdbc49fa52","Type":"ContainerDied","Data":"d93908f0f7882f100f121ede62c286ea1c2a7aa0cd146e3a31786361a696fc18"} Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.835269 4867 scope.go:117] "RemoveContainer" containerID="d107a15fd03c53fd4f1ecab07a06e4c731affb3d8e38d88a40b5e1a438e91145" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.842923 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-combined-ca-bundle\") pod \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.843191 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kckg\" (UniqueName: \"kubernetes.io/projected/82b74021-1aea-4ef5-981a-2b0fc63ec06b-kube-api-access-2kckg\") pod \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.843432 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-config\") pod \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\" (UID: \"82b74021-1aea-4ef5-981a-2b0fc63ec06b\") " Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.847154 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-js7h4" 
event={"ID":"82b74021-1aea-4ef5-981a-2b0fc63ec06b","Type":"ContainerDied","Data":"b2c3fe939ce96160b6457fec8d475c274f8e14ea45e211d375cdd1d71d53fb66"} Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.847284 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2c3fe939ce96160b6457fec8d475c274f8e14ea45e211d375cdd1d71d53fb66" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.847195 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-js7h4" Oct 06 13:21:56 crc kubenswrapper[4867]: E1006 13:21:56.852133 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.151:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-4vzlg" podUID="1ab90007-6383-4ff2-97cc-edb5d7d13d1e" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.861564 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b74021-1aea-4ef5-981a-2b0fc63ec06b-kube-api-access-2kckg" (OuterVolumeSpecName: "kube-api-access-2kckg") pod "82b74021-1aea-4ef5-981a-2b0fc63ec06b" (UID: "82b74021-1aea-4ef5-981a-2b0fc63ec06b"). InnerVolumeSpecName "kube-api-access-2kckg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.878448 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-config" (OuterVolumeSpecName: "config") pod "82b74021-1aea-4ef5-981a-2b0fc63ec06b" (UID: "82b74021-1aea-4ef5-981a-2b0fc63ec06b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.887318 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82b74021-1aea-4ef5-981a-2b0fc63ec06b" (UID: "82b74021-1aea-4ef5-981a-2b0fc63ec06b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.904505 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.927384 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.945512 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:21:56 crc kubenswrapper[4867]: E1006 13:21:56.946021 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api-log" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.946044 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api-log" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.946084 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kckg\" (UniqueName: \"kubernetes.io/projected/82b74021-1aea-4ef5-981a-2b0fc63ec06b-kube-api-access-2kckg\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.946111 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.946125 4867 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b74021-1aea-4ef5-981a-2b0fc63ec06b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:56 crc kubenswrapper[4867]: E1006 13:21:56.946092 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b74021-1aea-4ef5-981a-2b0fc63ec06b" containerName="neutron-db-sync" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.946150 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b74021-1aea-4ef5-981a-2b0fc63ec06b" containerName="neutron-db-sync" Oct 06 13:21:56 crc kubenswrapper[4867]: E1006 13:21:56.946173 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.946180 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.946704 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api-log" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.946737 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b74021-1aea-4ef5-981a-2b0fc63ec06b" containerName="neutron-db-sync" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.946761 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.948590 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.954080 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 06 13:21:56 crc kubenswrapper[4867]: I1006 13:21:56.961211 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.048390 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-config-data\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.048811 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.048853 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-logs\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.048979 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vht4\" (UniqueName: \"kubernetes.io/projected/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-kube-api-access-9vht4\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.049031 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.151758 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.152011 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-config-data\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.152106 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.152174 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-logs\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.152230 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vht4\" (UniqueName: \"kubernetes.io/projected/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-kube-api-access-9vht4\") pod 
\"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.152770 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-logs\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.156298 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.157173 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-config-data\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.161489 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.178810 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vht4\" (UniqueName: \"kubernetes.io/projected/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-kube-api-access-9vht4\") pod \"watcher-api-0\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.233740 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" path="/var/lib/kubelet/pods/c52253fc-5075-43f3-81e3-45cdbc49fa52/volumes" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.267041 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 13:21:57 crc kubenswrapper[4867]: I1006 13:21:57.267632 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="c52253fc-5075-43f3-81e3-45cdbc49fa52" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": dial tcp 10.217.0.155:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)" Oct 06 13:21:57 crc kubenswrapper[4867]: E1006 13:21:57.984776 4867 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.151:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 06 13:21:57 crc kubenswrapper[4867]: E1006 13:21:57.985184 4867 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.151:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 06 13:21:57 crc kubenswrapper[4867]: E1006 13:21:57.985896 4867 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.151:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s5k9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-m8h8z_openstack(69ef6013-a982-45c6-8fc8-46c11fead4a7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 13:21:57 crc kubenswrapper[4867]: E1006 13:21:57.991127 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-m8h8z" podUID="69ef6013-a982-45c6-8fc8-46c11fead4a7" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.064100 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5df64d6755-5gzc7"] Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.124584 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86dbf54f95-fr228"] Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.127438 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.153811 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5f7bcb84f4-pcrvc"] Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.167315 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.172748 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-c74pn" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.172956 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.184588 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.184818 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.186044 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-svc\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.186088 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-sb\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.186114 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-nb\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 
13:21:58.186136 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-swift-storage-0\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.186229 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsb6m\" (UniqueName: \"kubernetes.io/projected/a4334a9a-d6f0-418a-958e-755336a58527-kube-api-access-fsb6m\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.186311 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-config\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.210193 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86dbf54f95-fr228"] Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.222614 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f7bcb84f4-pcrvc"] Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.288184 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-config\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.289446 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-svc\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.289471 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-sb\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.289179 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-config\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.289497 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-nb\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.289595 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-httpd-config\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.289622 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-combined-ca-bundle\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.289658 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-swift-storage-0\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.289862 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-ovndb-tls-certs\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.289883 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s6ms\" (UniqueName: \"kubernetes.io/projected/f6218a59-1db5-4438-9fda-7781c1d4978b-kube-api-access-9s6ms\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.290063 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsb6m\" (UniqueName: \"kubernetes.io/projected/a4334a9a-d6f0-418a-958e-755336a58527-kube-api-access-fsb6m\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.290095 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-config\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.290195 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-nb\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.290777 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-swift-storage-0\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.291304 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-svc\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.291921 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-sb\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.317280 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsb6m\" (UniqueName: 
\"kubernetes.io/projected/a4334a9a-d6f0-418a-958e-755336a58527-kube-api-access-fsb6m\") pod \"dnsmasq-dns-86dbf54f95-fr228\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") " pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.343719 4867 scope.go:117] "RemoveContainer" containerID="d6e5bc238916ee18aa99888c7b12bb77fc2dcf388efce2a05c818f4457bfaf40" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.392671 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-config\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.392794 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-httpd-config\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.392817 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-combined-ca-bundle\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.392899 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-ovndb-tls-certs\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.392924 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9s6ms\" (UniqueName: \"kubernetes.io/projected/f6218a59-1db5-4438-9fda-7781c1d4978b-kube-api-access-9s6ms\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.401808 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-httpd-config\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.408741 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-ovndb-tls-certs\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.414754 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-config\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.415728 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s6ms\" (UniqueName: \"kubernetes.io/projected/f6218a59-1db5-4438-9fda-7781c1d4978b-kube-api-access-9s6ms\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.419925 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-combined-ca-bundle\") pod \"neutron-5f7bcb84f4-pcrvc\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") " pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.512602 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.555635 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.642772 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-577bfb968d-pw7pq"] Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.801674 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-69d5cf7ffb-c2rgt"] Oct 06 13:21:58 crc kubenswrapper[4867]: W1006 13:21:58.801893 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77541c32_3bc1_402d_aa9f_924f9b6cb37f.slice/crio-cbc9a756b5a13c59f64bb398bcaa677f88067b457d4449eb3e52985843e7d53d WatchSource:0}: Error finding container cbc9a756b5a13c59f64bb398bcaa677f88067b457d4449eb3e52985843e7d53d: Status 404 returned error can't find the container with id cbc9a756b5a13c59f64bb398bcaa677f88067b457d4449eb3e52985843e7d53d Oct 06 13:21:58 crc kubenswrapper[4867]: W1006 13:21:58.844321 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7e92d5c_74ed_47bc_995a_d3712014f109.slice/crio-f05570ebc9b6aa40506f5283d46f2f33497d09c530bd79172d93f26288fe2aa6 WatchSource:0}: Error finding container f05570ebc9b6aa40506f5283d46f2f33497d09c530bd79172d93f26288fe2aa6: Status 404 returned error can't find the container with id f05570ebc9b6aa40506f5283d46f2f33497d09c530bd79172d93f26288fe2aa6 Oct 06 13:21:58 crc 
kubenswrapper[4867]: I1006 13:21:58.878686 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d5cf7ffb-c2rgt" event={"ID":"d7e92d5c-74ed-47bc-995a-d3712014f109","Type":"ContainerStarted","Data":"f05570ebc9b6aa40506f5283d46f2f33497d09c530bd79172d93f26288fe2aa6"} Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.882001 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" podUID="60344bb3-1d96-49fc-bf7b-2fe9452160d3" containerName="dnsmasq-dns" containerID="cri-o://7226a25a135ba7f0ecc6b7fcd707aff9e91d93e9167c97057d207d56e96e9afb" gracePeriod=10 Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.882346 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" event={"ID":"60344bb3-1d96-49fc-bf7b-2fe9452160d3","Type":"ContainerStarted","Data":"7226a25a135ba7f0ecc6b7fcd707aff9e91d93e9167c97057d207d56e96e9afb"} Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.882391 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.892922 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-577bfb968d-pw7pq" event={"ID":"77541c32-3bc1-402d-aa9f-924f9b6cb37f","Type":"ContainerStarted","Data":"cbc9a756b5a13c59f64bb398bcaa677f88067b457d4449eb3e52985843e7d53d"} Oct 06 13:21:58 crc kubenswrapper[4867]: E1006 13:21:58.895435 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.151:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-m8h8z" podUID="69ef6013-a982-45c6-8fc8-46c11fead4a7" Oct 06 13:21:58 crc kubenswrapper[4867]: I1006 13:21:58.908395 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" podStartSLOduration=37.908372358 podStartE2EDuration="37.908372358s" podCreationTimestamp="2025-10-06 13:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:21:58.902498027 +0000 UTC m=+1098.360446171" watchObservedRunningTime="2025-10-06 13:21:58.908372358 +0000 UTC m=+1098.366320502" Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.198169 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.337136 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-plcg5"] Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.739008 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86dbf54f95-fr228"] Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.923359 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-plcg5" event={"ID":"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255","Type":"ContainerStarted","Data":"1912d9b772b7f62d25ab984e675b7568bd0a83dd566168324569e067e2a2fe0d"} Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.929544 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b","Type":"ContainerStarted","Data":"265e83a1b11783d3c534499e5ac8cc34db6d87b63cdf55ae18e186bb816f9c6e"} Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.929986 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.953477 4867 generic.go:334] "Generic (PLEG): container finished" podID="60344bb3-1d96-49fc-bf7b-2fe9452160d3" containerID="7226a25a135ba7f0ecc6b7fcd707aff9e91d93e9167c97057d207d56e96e9afb" exitCode=0 Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.953603 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" event={"ID":"60344bb3-1d96-49fc-bf7b-2fe9452160d3","Type":"ContainerDied","Data":"7226a25a135ba7f0ecc6b7fcd707aff9e91d93e9167c97057d207d56e96e9afb"} Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.953649 4867 scope.go:117] "RemoveContainer" containerID="7226a25a135ba7f0ecc6b7fcd707aff9e91d93e9167c97057d207d56e96e9afb" Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.967016 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-swift-storage-0\") pod \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.967083 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm624\" (UniqueName: \"kubernetes.io/projected/60344bb3-1d96-49fc-bf7b-2fe9452160d3-kube-api-access-vm624\") pod \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.967229 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-svc\") pod \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.967457 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-config\") pod \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.967491 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-sb\") pod \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.967557 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-nb\") pod \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\" (UID: \"60344bb3-1d96-49fc-bf7b-2fe9452160d3\") " Oct 06 13:21:59 crc kubenswrapper[4867]: I1006 13:21:59.991581 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dbf54f95-fr228" event={"ID":"a4334a9a-d6f0-418a-958e-755336a58527","Type":"ContainerStarted","Data":"8ef2deda54353d99c01f0aa42c5dd7c30961dea9ae094468e5ab0fab4a14f505"} Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.009403 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d55bc49dc-hzd62" event={"ID":"ef1f9710-07da-4226-befb-73474a496cae","Type":"ContainerStarted","Data":"55fe2cb95aabb0ee7634f115dd8c77c66b892c09e51aab6c222f89e9ef0a91db"} Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.012725 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01","Type":"ContainerStarted","Data":"81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2"} Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.015855 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/watcher-decision-engine-0" event={"ID":"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3","Type":"ContainerStarted","Data":"469cdde25e5de2f983012628bf0abd04ecd346dda8914428572b0c7bc1a08662"} Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.025960 4867 scope.go:117] "RemoveContainer" containerID="9dc386b8a5ff32df5970821557cf49b137d0ceb352127e977a5d961a74f562c2" Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.044097 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=7.406622331 podStartE2EDuration="40.04407483s" podCreationTimestamp="2025-10-06 13:21:20 +0000 UTC" firstStartedPulling="2025-10-06 13:21:23.355215704 +0000 UTC m=+1062.813163848" lastFinishedPulling="2025-10-06 13:21:55.992668183 +0000 UTC m=+1095.450616347" observedRunningTime="2025-10-06 13:22:00.036901614 +0000 UTC m=+1099.494849758" watchObservedRunningTime="2025-10-06 13:22:00.04407483 +0000 UTC m=+1099.502022974" Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.048136 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60344bb3-1d96-49fc-bf7b-2fe9452160d3-kube-api-access-vm624" (OuterVolumeSpecName: "kube-api-access-vm624") pod "60344bb3-1d96-49fc-bf7b-2fe9452160d3" (UID: "60344bb3-1d96-49fc-bf7b-2fe9452160d3"). InnerVolumeSpecName "kube-api-access-vm624". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.058548 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f7bcb84f4-pcrvc"] Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.073863 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm624\" (UniqueName: \"kubernetes.io/projected/60344bb3-1d96-49fc-bf7b-2fe9452160d3-kube-api-access-vm624\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.816083 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60344bb3-1d96-49fc-bf7b-2fe9452160d3" (UID: "60344bb3-1d96-49fc-bf7b-2fe9452160d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.864156 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-config" (OuterVolumeSpecName: "config") pod "60344bb3-1d96-49fc-bf7b-2fe9452160d3" (UID: "60344bb3-1d96-49fc-bf7b-2fe9452160d3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.922200 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.970384 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.958183 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "60344bb3-1d96-49fc-bf7b-2fe9452160d3" (UID: "60344bb3-1d96-49fc-bf7b-2fe9452160d3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:00 crc kubenswrapper[4867]: I1006 13:22:00.958340 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "60344bb3-1d96-49fc-bf7b-2fe9452160d3" (UID: "60344bb3-1d96-49fc-bf7b-2fe9452160d3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.000555 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "60344bb3-1d96-49fc-bf7b-2fe9452160d3" (UID: "60344bb3-1d96-49fc-bf7b-2fe9452160d3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.055515 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d8hp7" event={"ID":"4f59ab79-d706-4e1f-9361-6efea6b85568","Type":"ContainerStarted","Data":"bf267a34fc01a7e04ddf7818b14c07aaa19939c20716b1b7141e26d95cb6957f"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.063737 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7544c988fc-272d4" event={"ID":"a8819e30-41b5-4fcd-8158-b6b5c178aea9","Type":"ContainerStarted","Data":"c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.071689 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.071903 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.071993 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60344bb3-1d96-49fc-bf7b-2fe9452160d3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.074491 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58966f7699-hjfh4" event={"ID":"2a642d87-db23-4d12-90ac-ebdfbfe00996","Type":"ContainerStarted","Data":"637ed67407d0d2cc300c14f7387d0fd2855bd868a43ad5c7bad18ba2e888ab7f"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.086791 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-d8hp7" 
podStartSLOduration=5.539513452 podStartE2EDuration="40.086770591s" podCreationTimestamp="2025-10-06 13:21:21 +0000 UTC" firstStartedPulling="2025-10-06 13:21:23.853625542 +0000 UTC m=+1063.311573686" lastFinishedPulling="2025-10-06 13:21:58.400882671 +0000 UTC m=+1097.858830825" observedRunningTime="2025-10-06 13:22:01.080917171 +0000 UTC m=+1100.538865315" watchObservedRunningTime="2025-10-06 13:22:01.086770591 +0000 UTC m=+1100.544718735" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.098440 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d854b81e-d1da-4eb4-9291-0a98cf04d652","Type":"ContainerStarted","Data":"f59f5ff42c1c8095b281e59032bcc653ab2360372ada4561201d9c5927099d85"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.098630 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d854b81e-d1da-4eb4-9291-0a98cf04d652" containerName="glance-log" containerID="cri-o://e971769bba688411b067a0f78eb44b92a4867828187d9ff697ebe1ff5d98a973" gracePeriod=30 Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.098904 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d854b81e-d1da-4eb4-9291-0a98cf04d652" containerName="glance-httpd" containerID="cri-o://f59f5ff42c1c8095b281e59032bcc653ab2360372ada4561201d9c5927099d85" gracePeriod=30 Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.105137 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-plcg5" event={"ID":"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255","Type":"ContainerStarted","Data":"b156a56991af149be07f68a2d46a2d3973ab6eb4e2e2b1fb6f24753c0d2e173a"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.126057 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d5cf7ffb-c2rgt" 
event={"ID":"d7e92d5c-74ed-47bc-995a-d3712014f109","Type":"ContainerStarted","Data":"71a25cd8746dff659fa84b7d02abb69e2e0eb14574a0896955baef2f7d08e1ec"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.136039 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c47455745-hd5zg"] Oct 06 13:22:01 crc kubenswrapper[4867]: E1006 13:22:01.136780 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60344bb3-1d96-49fc-bf7b-2fe9452160d3" containerName="dnsmasq-dns" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.136847 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="60344bb3-1d96-49fc-bf7b-2fe9452160d3" containerName="dnsmasq-dns" Oct 06 13:22:01 crc kubenswrapper[4867]: E1006 13:22:01.136929 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60344bb3-1d96-49fc-bf7b-2fe9452160d3" containerName="init" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.136986 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="60344bb3-1d96-49fc-bf7b-2fe9452160d3" containerName="init" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.137281 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="60344bb3-1d96-49fc-bf7b-2fe9452160d3" containerName="dnsmasq-dns" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.138491 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.139545 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.139684 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" event={"ID":"60344bb3-1d96-49fc-bf7b-2fe9452160d3","Type":"ContainerDied","Data":"d6aafe20a37b6b66ce19777b5dde436fc1df9a858668e8c74c96e310b24a8a55"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.141995 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.142307 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.153778 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-577bfb968d-pw7pq" event={"ID":"77541c32-3bc1-402d-aa9f-924f9b6cb37f","Type":"ContainerStarted","Data":"5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.154089 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=39.154070351 podStartE2EDuration="39.154070351s" podCreationTimestamp="2025-10-06 13:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:01.149806844 +0000 UTC m=+1100.607754988" watchObservedRunningTime="2025-10-06 13:22:01.154070351 +0000 UTC m=+1100.612018495" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.165611 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d55bc49dc-hzd62" event={"ID":"ef1f9710-07da-4226-befb-73474a496cae","Type":"ContainerStarted","Data":"071906446a1f7b1b30a9be47203da070f524eb694ef9e09aa6dfdc6449357c85"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.165794 4867 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d55bc49dc-hzd62" podUID="ef1f9710-07da-4226-befb-73474a496cae" containerName="horizon-log" containerID="cri-o://55fe2cb95aabb0ee7634f115dd8c77c66b892c09e51aab6c222f89e9ef0a91db" gracePeriod=30 Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.166099 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d55bc49dc-hzd62" podUID="ef1f9710-07da-4226-befb-73474a496cae" containerName="horizon" containerID="cri-o://071906446a1f7b1b30a9be47203da070f524eb694ef9e09aa6dfdc6449357c85" gracePeriod=30 Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.173904 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c47455745-hd5zg"] Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.217673 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c6e4e3d-0f22-4a7d-9396-0ac732d65496","Type":"ContainerStarted","Data":"44b058bbf0fc727a5870ef13185982a68418e5e0c1576ec80e20359f8ab43f5f"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.217915 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3c6e4e3d-0f22-4a7d-9396-0ac732d65496" containerName="glance-log" containerID="cri-o://ec52b5e3239eb2de61e0a540744cd800a134afc14fcf17c1339d502e8a860797" gracePeriod=30 Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.218268 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3c6e4e3d-0f22-4a7d-9396-0ac732d65496" containerName="glance-httpd" containerID="cri-o://44b058bbf0fc727a5870ef13185982a68418e5e0c1576ec80e20359f8ab43f5f" gracePeriod=30 Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.242970 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-bootstrap-plcg5" podStartSLOduration=18.242943461 podStartE2EDuration="18.242943461s" podCreationTimestamp="2025-10-06 13:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:01.219068358 +0000 UTC m=+1100.677016502" watchObservedRunningTime="2025-10-06 13:22:01.242943461 +0000 UTC m=+1100.700891605" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.262046 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=40.262021583 podStartE2EDuration="40.262021583s" podCreationTimestamp="2025-10-06 13:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:01.258090175 +0000 UTC m=+1100.716038319" watchObservedRunningTime="2025-10-06 13:22:01.262021583 +0000 UTC m=+1100.719969727" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.276945 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-httpd-config\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.276988 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-config\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.277051 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-ovndb-tls-certs\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.277119 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-internal-tls-certs\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.277139 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-combined-ca-bundle\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.277165 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-public-tls-certs\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.277227 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htwkd\" (UniqueName: \"kubernetes.io/projected/81a1b704-8648-453e-b052-9a2721cf9830-kube-api-access-htwkd\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.303301 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d55bc49dc-hzd62" 
podStartSLOduration=6.13080816 podStartE2EDuration="40.303275501s" podCreationTimestamp="2025-10-06 13:21:21 +0000 UTC" firstStartedPulling="2025-10-06 13:21:23.721036617 +0000 UTC m=+1063.178984761" lastFinishedPulling="2025-10-06 13:21:57.893503958 +0000 UTC m=+1097.351452102" observedRunningTime="2025-10-06 13:22:01.286859002 +0000 UTC m=+1100.744807326" watchObservedRunningTime="2025-10-06 13:22:01.303275501 +0000 UTC m=+1100.761223665" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.339998 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=8.576760137 podStartE2EDuration="41.339977894s" podCreationTimestamp="2025-10-06 13:21:20 +0000 UTC" firstStartedPulling="2025-10-06 13:21:23.301135976 +0000 UTC m=+1062.759084120" lastFinishedPulling="2025-10-06 13:21:56.064353713 +0000 UTC m=+1095.522301877" observedRunningTime="2025-10-06 13:22:01.339499561 +0000 UTC m=+1100.797447695" watchObservedRunningTime="2025-10-06 13:22:01.339977894 +0000 UTC m=+1100.797926038" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.379235 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-internal-tls-certs\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.379296 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-combined-ca-bundle\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.379350 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-public-tls-certs\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.379474 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htwkd\" (UniqueName: \"kubernetes.io/projected/81a1b704-8648-453e-b052-9a2721cf9830-kube-api-access-htwkd\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.379529 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-httpd-config\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.379555 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-config\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.379647 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-ovndb-tls-certs\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.396779 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-httpd-config\") pod \"neutron-c47455745-hd5zg\" (UID: 
\"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.398136 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-internal-tls-certs\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.398550 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-combined-ca-bundle\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.408587 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-public-tls-certs\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.409881 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htwkd\" (UniqueName: \"kubernetes.io/projected/81a1b704-8648-453e-b052-9a2721cf9830-kube-api-access-htwkd\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.426685 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-ovndb-tls-certs\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc 
kubenswrapper[4867]: I1006 13:22:01.429001 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/81a1b704-8648-453e-b052-9a2721cf9830-config\") pod \"neutron-c47455745-hd5zg\" (UID: \"81a1b704-8648-453e-b052-9a2721cf9830\") " pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.519611 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"78044486-0042-4b06-ae63-0dbfe9b873ce","Type":"ContainerStarted","Data":"2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.519690 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f7bcb84f4-pcrvc" event={"ID":"f6218a59-1db5-4438-9fda-7781c1d4978b","Type":"ContainerStarted","Data":"8c79e3d74cff0ba121ff474bc02fbab6af712544ad541b963a997aa7d34c54be"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.519709 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b","Type":"ContainerStarted","Data":"9cc4014c17d550097efe2207ca0ce53361ab7e1be6cc2bccd4ec76c4248e8ab8"} Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.615015 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.622568 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.622628 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.642111 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.662962 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 06 13:22:01 crc kubenswrapper[4867]: I1006 13:22:01.671082 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.050972 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.405166 4867 generic.go:334] "Generic (PLEG): container finished" podID="d854b81e-d1da-4eb4-9291-0a98cf04d652" containerID="f59f5ff42c1c8095b281e59032bcc653ab2360372ada4561201d9c5927099d85" exitCode=0 Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.405206 4867 generic.go:334] "Generic (PLEG): container finished" podID="d854b81e-d1da-4eb4-9291-0a98cf04d652" containerID="e971769bba688411b067a0f78eb44b92a4867828187d9ff697ebe1ff5d98a973" exitCode=143 Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.405276 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d854b81e-d1da-4eb4-9291-0a98cf04d652","Type":"ContainerDied","Data":"f59f5ff42c1c8095b281e59032bcc653ab2360372ada4561201d9c5927099d85"} Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.405315 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d854b81e-d1da-4eb4-9291-0a98cf04d652","Type":"ContainerDied","Data":"e971769bba688411b067a0f78eb44b92a4867828187d9ff697ebe1ff5d98a973"} Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.445968 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b","Type":"ContainerStarted","Data":"d2a9f0156e720f2b5bf975b13afd79e9cfdd4b1fbf4cb328c04042ca361342fa"} Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.446662 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.448481 4867 generic.go:334] "Generic (PLEG): container finished" podID="a4334a9a-d6f0-418a-958e-755336a58527" containerID="fc7e3f04f451d1fcc1bc448b4c361da1e444b060f8f92f675eb101095cfcaddb" exitCode=0 Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.448567 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dbf54f95-fr228" event={"ID":"a4334a9a-d6f0-418a-958e-755336a58527","Type":"ContainerDied","Data":"fc7e3f04f451d1fcc1bc448b4c361da1e444b060f8f92f675eb101095cfcaddb"} Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.458183 4867 generic.go:334] "Generic (PLEG): container finished" podID="3c6e4e3d-0f22-4a7d-9396-0ac732d65496" containerID="44b058bbf0fc727a5870ef13185982a68418e5e0c1576ec80e20359f8ab43f5f" exitCode=0 Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.458223 4867 generic.go:334] "Generic (PLEG): container finished" podID="3c6e4e3d-0f22-4a7d-9396-0ac732d65496" containerID="ec52b5e3239eb2de61e0a540744cd800a134afc14fcf17c1339d502e8a860797" exitCode=143 Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.458300 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c6e4e3d-0f22-4a7d-9396-0ac732d65496","Type":"ContainerDied","Data":"44b058bbf0fc727a5870ef13185982a68418e5e0c1576ec80e20359f8ab43f5f"} Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.458337 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"3c6e4e3d-0f22-4a7d-9396-0ac732d65496","Type":"ContainerDied","Data":"ec52b5e3239eb2de61e0a540744cd800a134afc14fcf17c1339d502e8a860797"} Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.473207 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7544c988fc-272d4" podUID="a8819e30-41b5-4fcd-8158-b6b5c178aea9" containerName="horizon-log" containerID="cri-o://c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d" gracePeriod=30 Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.475127 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7544c988fc-272d4" event={"ID":"a8819e30-41b5-4fcd-8158-b6b5c178aea9","Type":"ContainerStarted","Data":"587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5"} Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.475368 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7544c988fc-272d4" podUID="a8819e30-41b5-4fcd-8158-b6b5c178aea9" containerName="horizon" containerID="cri-o://587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5" gracePeriod=30 Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.476125 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.500526 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=6.500488056 podStartE2EDuration="6.500488056s" podCreationTimestamp="2025-10-06 13:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:02.473849658 +0000 UTC m=+1101.931797802" watchObservedRunningTime="2025-10-06 13:22:02.500488056 +0000 UTC m=+1101.958436210" Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.546143 4867 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/horizon-7544c988fc-272d4" podStartSLOduration=7.735106765 podStartE2EDuration="38.546104953s" podCreationTimestamp="2025-10-06 13:21:24 +0000 UTC" firstStartedPulling="2025-10-06 13:21:27.577530905 +0000 UTC m=+1067.035479049" lastFinishedPulling="2025-10-06 13:21:58.388529093 +0000 UTC m=+1097.846477237" observedRunningTime="2025-10-06 13:22:02.538926977 +0000 UTC m=+1101.996875121" watchObservedRunningTime="2025-10-06 13:22:02.546104953 +0000 UTC m=+1102.004053097" Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.831636 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.849419 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.910518 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:22:02 crc kubenswrapper[4867]: I1006 13:22:02.979763 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.159149 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.216063 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.259639 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-scripts\") pod \"d854b81e-d1da-4eb4-9291-0a98cf04d652\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.259811 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-logs\") pod \"d854b81e-d1da-4eb4-9291-0a98cf04d652\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.259842 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45xbn\" (UniqueName: \"kubernetes.io/projected/d854b81e-d1da-4eb4-9291-0a98cf04d652-kube-api-access-45xbn\") pod \"d854b81e-d1da-4eb4-9291-0a98cf04d652\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.259934 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-config-data\") pod \"d854b81e-d1da-4eb4-9291-0a98cf04d652\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.259984 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-combined-ca-bundle\") pod \"d854b81e-d1da-4eb4-9291-0a98cf04d652\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.260015 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-httpd-run\") pod \"d854b81e-d1da-4eb4-9291-0a98cf04d652\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.260111 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"d854b81e-d1da-4eb4-9291-0a98cf04d652\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.260137 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-public-tls-certs\") pod \"d854b81e-d1da-4eb4-9291-0a98cf04d652\" (UID: \"d854b81e-d1da-4eb4-9291-0a98cf04d652\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.263582 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d854b81e-d1da-4eb4-9291-0a98cf04d652" (UID: "d854b81e-d1da-4eb4-9291-0a98cf04d652"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.264962 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-logs" (OuterVolumeSpecName: "logs") pod "d854b81e-d1da-4eb4-9291-0a98cf04d652" (UID: "d854b81e-d1da-4eb4-9291-0a98cf04d652"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.288181 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d854b81e-d1da-4eb4-9291-0a98cf04d652-kube-api-access-45xbn" (OuterVolumeSpecName: "kube-api-access-45xbn") pod "d854b81e-d1da-4eb4-9291-0a98cf04d652" (UID: "d854b81e-d1da-4eb4-9291-0a98cf04d652"). InnerVolumeSpecName "kube-api-access-45xbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.303893 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "d854b81e-d1da-4eb4-9291-0a98cf04d652" (UID: "d854b81e-d1da-4eb4-9291-0a98cf04d652"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.303877 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-scripts" (OuterVolumeSpecName: "scripts") pod "d854b81e-d1da-4eb4-9291-0a98cf04d652" (UID: "d854b81e-d1da-4eb4-9291-0a98cf04d652"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.362330 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-scripts\") pod \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.363024 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-config-data\") pod \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.363076 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-httpd-run\") pod \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.363142 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.363206 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-combined-ca-bundle\") pod \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.363312 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-internal-tls-certs\") pod \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.363352 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-logs\") pod \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.363395 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fwf2\" (UniqueName: \"kubernetes.io/projected/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-kube-api-access-4fwf2\") pod \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\" (UID: \"3c6e4e3d-0f22-4a7d-9396-0ac732d65496\") " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.364094 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.364118 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45xbn\" (UniqueName: \"kubernetes.io/projected/d854b81e-d1da-4eb4-9291-0a98cf04d652-kube-api-access-45xbn\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.364129 4867 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d854b81e-d1da-4eb4-9291-0a98cf04d652-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.364151 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.364161 4867 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.375416 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-logs" (OuterVolumeSpecName: "logs") pod "3c6e4e3d-0f22-4a7d-9396-0ac732d65496" (UID: "3c6e4e3d-0f22-4a7d-9396-0ac732d65496"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.380813 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3c6e4e3d-0f22-4a7d-9396-0ac732d65496" (UID: "3c6e4e3d-0f22-4a7d-9396-0ac732d65496"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.395644 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-scripts" (OuterVolumeSpecName: "scripts") pod "3c6e4e3d-0f22-4a7d-9396-0ac732d65496" (UID: "3c6e4e3d-0f22-4a7d-9396-0ac732d65496"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.402109 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "3c6e4e3d-0f22-4a7d-9396-0ac732d65496" (UID: "3c6e4e3d-0f22-4a7d-9396-0ac732d65496"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.402473 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-kube-api-access-4fwf2" (OuterVolumeSpecName: "kube-api-access-4fwf2") pod "3c6e4e3d-0f22-4a7d-9396-0ac732d65496" (UID: "3c6e4e3d-0f22-4a7d-9396-0ac732d65496"). InnerVolumeSpecName "kube-api-access-4fwf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.471590 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.471647 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.471662 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fwf2\" (UniqueName: \"kubernetes.io/projected/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-kube-api-access-4fwf2\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.471677 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.471689 4867 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.510134 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.512658 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d854b81e-d1da-4eb4-9291-0a98cf04d652" (UID: "d854b81e-d1da-4eb4-9291-0a98cf04d652"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.518561 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c6e4e3d-0f22-4a7d-9396-0ac732d65496" (UID: "3c6e4e3d-0f22-4a7d-9396-0ac732d65496"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.549463 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.559926 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58966f7699-hjfh4" podUID="2a642d87-db23-4d12-90ac-ebdfbfe00996" containerName="horizon-log" containerID="cri-o://637ed67407d0d2cc300c14f7387d0fd2855bd868a43ad5c7bad18ba2e888ab7f" gracePeriod=30 Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.560619 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58966f7699-hjfh4" podUID="2a642d87-db23-4d12-90ac-ebdfbfe00996" containerName="horizon" containerID="cri-o://60499b97d5c35d9b94dd6e0ffb01e9156330c4d6a9e670fc2b52d63436d243f7" gracePeriod=30 Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.573272 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.573300 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.573310 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.600822 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.610219 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.613272 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-577bfb968d-pw7pq" podStartSLOduration=32.613233981 podStartE2EDuration="32.613233981s" podCreationTimestamp="2025-10-06 13:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:03.594038156 +0000 UTC m=+1103.051986320" watchObservedRunningTime="2025-10-06 13:22:03.613233981 +0000 UTC m=+1103.071182125" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.619215 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d854b81e-d1da-4eb4-9291-0a98cf04d652" (UID: "d854b81e-d1da-4eb4-9291-0a98cf04d652"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.675724 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-config-data" (OuterVolumeSpecName: "config-data") pod "d854b81e-d1da-4eb4-9291-0a98cf04d652" (UID: "d854b81e-d1da-4eb4-9291-0a98cf04d652"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.684216 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.684272 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.684285 4867 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d854b81e-d1da-4eb4-9291-0a98cf04d652-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.685221 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3c6e4e3d-0f22-4a7d-9396-0ac732d65496" (UID: "3c6e4e3d-0f22-4a7d-9396-0ac732d65496"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.713069 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58966f7699-hjfh4" podStartSLOduration=8.616234718 podStartE2EDuration="42.7130375s" podCreationTimestamp="2025-10-06 13:21:21 +0000 UTC" firstStartedPulling="2025-10-06 13:21:24.304792489 +0000 UTC m=+1063.762740633" lastFinishedPulling="2025-10-06 13:21:58.401595271 +0000 UTC m=+1097.859543415" observedRunningTime="2025-10-06 13:22:03.6453844 +0000 UTC m=+1103.103332544" watchObservedRunningTime="2025-10-06 13:22:03.7130375 +0000 UTC m=+1103.170985644" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.773530 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-config-data" (OuterVolumeSpecName: "config-data") pod "3c6e4e3d-0f22-4a7d-9396-0ac732d65496" (UID: "3c6e4e3d-0f22-4a7d-9396-0ac732d65496"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.797184 4867 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.797223 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6e4e3d-0f22-4a7d-9396-0ac732d65496-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.805081 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-69d5cf7ffb-c2rgt" podStartSLOduration=32.805056306 podStartE2EDuration="32.805056306s" podCreationTimestamp="2025-10-06 13:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:03.715584479 +0000 UTC m=+1103.173532623" watchObservedRunningTime="2025-10-06 13:22:03.805056306 +0000 UTC m=+1103.263004450" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.933334 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-577bfb968d-pw7pq" event={"ID":"77541c32-3bc1-402d-aa9f-924f9b6cb37f","Type":"ContainerStarted","Data":"8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1"} Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.933414 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01","Type":"ContainerStarted","Data":"7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1"} Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.933442 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c47455745-hd5zg"] Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.933463 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3c6e4e3d-0f22-4a7d-9396-0ac732d65496","Type":"ContainerDied","Data":"9a3ca6b45fd5802cd49c8d7039dc703071417a44f664a074dad0fadfefce27d1"} Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.933484 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f7bcb84f4-pcrvc" event={"ID":"f6218a59-1db5-4438-9fda-7781c1d4978b","Type":"ContainerStarted","Data":"2907d6d90ebdd474879af9266eaa64ceae93e8d4b8d59a82e2a647b608165118"} Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.933495 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58966f7699-hjfh4" event={"ID":"2a642d87-db23-4d12-90ac-ebdfbfe00996","Type":"ContainerStarted","Data":"60499b97d5c35d9b94dd6e0ffb01e9156330c4d6a9e670fc2b52d63436d243f7"} Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.933505 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d854b81e-d1da-4eb4-9291-0a98cf04d652","Type":"ContainerDied","Data":"ac66e97d55a899d9e3657c35fb81f0c9746684722765ce59553220a3abf44b51"} Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.933519 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-69d5cf7ffb-c2rgt" event={"ID":"d7e92d5c-74ed-47bc-995a-d3712014f109","Type":"ContainerStarted","Data":"59cf42d8811e2298df592a7e6e338b41049f78cc0a1ae78ae9de7961bc66f4a3"} Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.934101 4867 scope.go:117] "RemoveContainer" containerID="44b058bbf0fc727a5870ef13185982a68418e5e0c1576ec80e20359f8ab43f5f" Oct 06 13:22:03 crc kubenswrapper[4867]: I1006 13:22:03.998325 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.015752 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] 
Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.031882 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.038324 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.084897 4867 scope.go:117] "RemoveContainer" containerID="ec52b5e3239eb2de61e0a540744cd800a134afc14fcf17c1339d502e8a860797" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.101385 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:22:04 crc kubenswrapper[4867]: E1006 13:22:04.106608 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6e4e3d-0f22-4a7d-9396-0ac732d65496" containerName="glance-log" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.106636 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6e4e3d-0f22-4a7d-9396-0ac732d65496" containerName="glance-log" Oct 06 13:22:04 crc kubenswrapper[4867]: E1006 13:22:04.108040 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d854b81e-d1da-4eb4-9291-0a98cf04d652" containerName="glance-httpd" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.108053 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d854b81e-d1da-4eb4-9291-0a98cf04d652" containerName="glance-httpd" Oct 06 13:22:04 crc kubenswrapper[4867]: E1006 13:22:04.108074 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d854b81e-d1da-4eb4-9291-0a98cf04d652" containerName="glance-log" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.108081 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d854b81e-d1da-4eb4-9291-0a98cf04d652" containerName="glance-log" Oct 06 13:22:04 crc kubenswrapper[4867]: E1006 13:22:04.108087 4867 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3c6e4e3d-0f22-4a7d-9396-0ac732d65496" containerName="glance-httpd" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.108095 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6e4e3d-0f22-4a7d-9396-0ac732d65496" containerName="glance-httpd" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.108713 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d854b81e-d1da-4eb4-9291-0a98cf04d652" containerName="glance-log" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.108737 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6e4e3d-0f22-4a7d-9396-0ac732d65496" containerName="glance-httpd" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.108768 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6e4e3d-0f22-4a7d-9396-0ac732d65496" containerName="glance-log" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.108781 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d854b81e-d1da-4eb4-9291-0a98cf04d652" containerName="glance-httpd" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.117511 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.120947 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.121135 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vgsz4" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.121855 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.122095 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.124232 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.182732 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.186184 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.190864 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.191098 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.193826 4867 scope.go:117] "RemoveContainer" containerID="f59f5ff42c1c8095b281e59032bcc653ab2360372ada4561201d9c5927099d85" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.198369 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.226735 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsqlv\" (UniqueName: \"kubernetes.io/projected/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-kube-api-access-wsqlv\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.226784 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.226833 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc 
kubenswrapper[4867]: I1006 13:22:04.226858 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-logs\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.226894 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.226931 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.226973 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.226997 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " 
pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.244241 4867 scope.go:117] "RemoveContainer" containerID="e971769bba688411b067a0f78eb44b92a4867828187d9ff697ebe1ff5d98a973" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.328759 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.328828 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-logs\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.328860 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.328905 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.328929 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.328945 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-logs\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.328980 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.328997 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tlnv\" (UniqueName: \"kubernetes.io/projected/f72b9a47-8845-4870-9e50-d15fa17e2db4-kube-api-access-2tlnv\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.329014 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.329038 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.329060 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-scripts\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.329094 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.329112 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.329127 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-config-data\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.329172 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsqlv\" 
(UniqueName: \"kubernetes.io/projected/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-kube-api-access-wsqlv\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.329191 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.333308 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.333723 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.342535 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-logs\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.342857 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.351101 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.352061 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.382669 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.383373 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsqlv\" (UniqueName: \"kubernetes.io/projected/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-kube-api-access-wsqlv\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.430501 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.431106 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.431242 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.431345 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-logs\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.431458 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.431556 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tlnv\" (UniqueName: \"kubernetes.io/projected/f72b9a47-8845-4870-9e50-d15fa17e2db4-kube-api-access-2tlnv\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " 
pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.431681 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.431767 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-scripts\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.432312 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-logs\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.436955 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.437958 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-config-data\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.438088 4867 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.442682 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.447628 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-scripts\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.448170 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.458898 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.495084 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.496401 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tlnv\" (UniqueName: \"kubernetes.io/projected/f72b9a47-8845-4870-9e50-d15fa17e2db4-kube-api-access-2tlnv\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.496849 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.522513 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.778754 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c47455745-hd5zg" event={"ID":"81a1b704-8648-453e-b052-9a2721cf9830","Type":"ContainerStarted","Data":"de9656f3a0156dab93a22f59ceaf64fafdb07aae18352890eeda9de0e9b5bc82"} Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.778808 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c47455745-hd5zg" event={"ID":"81a1b704-8648-453e-b052-9a2721cf9830","Type":"ContainerStarted","Data":"3f326228f237fff49ff38ae15ecc9ead5096418cca475240e03f150e4d2355c2"} Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.781895 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f7bcb84f4-pcrvc" event={"ID":"f6218a59-1db5-4438-9fda-7781c1d4978b","Type":"ContainerStarted","Data":"6ebdeedb331438c0c01a083fb0c28d23295c8aa33cc117bd9ce5f3ef209832d0"} Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.784002 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.845852 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dbf54f95-fr228" event={"ID":"a4334a9a-d6f0-418a-958e-755336a58527","Type":"ContainerStarted","Data":"3adaf03bbf319a863a4fe29ba6bbb6df6f0f41e10bbc9bae81ee9b45129f3e32"} Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.846655 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.888569 4867 generic.go:334] "Generic (PLEG): container finished" podID="4f59ab79-d706-4e1f-9361-6efea6b85568" containerID="bf267a34fc01a7e04ddf7818b14c07aaa19939c20716b1b7141e26d95cb6957f" exitCode=0 Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.888900 4867 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3" containerName="watcher-decision-engine" containerID="cri-o://469cdde25e5de2f983012628bf0abd04ecd346dda8914428572b0c7bc1a08662" gracePeriod=30 Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.889326 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d8hp7" event={"ID":"4f59ab79-d706-4e1f-9361-6efea6b85568","Type":"ContainerDied","Data":"bf267a34fc01a7e04ddf7818b14c07aaa19939c20716b1b7141e26d95cb6957f"} Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.890041 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="78044486-0042-4b06-ae63-0dbfe9b873ce" containerName="watcher-applier" containerID="cri-o://2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7" gracePeriod=30 Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.946211 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86dbf54f95-fr228" podStartSLOduration=6.946179238 podStartE2EDuration="6.946179238s" podCreationTimestamp="2025-10-06 13:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:04.9078781 +0000 UTC m=+1104.365826244" watchObservedRunningTime="2025-10-06 13:22:04.946179238 +0000 UTC m=+1104.404127382" Oct 06 13:22:04 crc kubenswrapper[4867]: I1006 13:22:04.972344 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5f7bcb84f4-pcrvc" podStartSLOduration=6.972314902 podStartE2EDuration="6.972314902s" podCreationTimestamp="2025-10-06 13:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:04.872842882 +0000 UTC 
m=+1104.330791026" watchObservedRunningTime="2025-10-06 13:22:04.972314902 +0000 UTC m=+1104.430263046" Oct 06 13:22:05 crc kubenswrapper[4867]: I1006 13:22:05.193385 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:22:05 crc kubenswrapper[4867]: I1006 13:22:05.244409 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c6e4e3d-0f22-4a7d-9396-0ac732d65496" path="/var/lib/kubelet/pods/3c6e4e3d-0f22-4a7d-9396-0ac732d65496/volumes" Oct 06 13:22:05 crc kubenswrapper[4867]: I1006 13:22:05.245518 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d854b81e-d1da-4eb4-9291-0a98cf04d652" path="/var/lib/kubelet/pods/d854b81e-d1da-4eb4-9291-0a98cf04d652/volumes" Oct 06 13:22:05 crc kubenswrapper[4867]: I1006 13:22:05.484542 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:22:05 crc kubenswrapper[4867]: I1006 13:22:05.604624 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:22:05 crc kubenswrapper[4867]: W1006 13:22:05.630436 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf72b9a47_8845_4870_9e50_d15fa17e2db4.slice/crio-2b78342651932d1ae0b7393d057d9ea393c0e470c05b145d014c30963bbd6a93 WatchSource:0}: Error finding container 2b78342651932d1ae0b7393d057d9ea393c0e470c05b145d014c30963bbd6a93: Status 404 returned error can't find the container with id 2b78342651932d1ae0b7393d057d9ea393c0e470c05b145d014c30963bbd6a93 Oct 06 13:22:05 crc kubenswrapper[4867]: I1006 13:22:05.898434 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 06 13:22:05 crc kubenswrapper[4867]: I1006 13:22:05.944836 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c47455745-hd5zg" 
event={"ID":"81a1b704-8648-453e-b052-9a2721cf9830","Type":"ContainerStarted","Data":"8afaa2fbc4f291458d23b7d4832dca5faaf8e27dd581d63f3231aa6bf0b739ff"} Oct 06 13:22:05 crc kubenswrapper[4867]: I1006 13:22:05.946318 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:05 crc kubenswrapper[4867]: I1006 13:22:05.957468 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae","Type":"ContainerStarted","Data":"fe510d92f245fcfa5be83f61e662a0fd41864204a507f146e1990cea1622ea37"} Oct 06 13:22:05 crc kubenswrapper[4867]: I1006 13:22:05.979222 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f72b9a47-8845-4870-9e50-d15fa17e2db4","Type":"ContainerStarted","Data":"2b78342651932d1ae0b7393d057d9ea393c0e470c05b145d014c30963bbd6a93"} Oct 06 13:22:05 crc kubenswrapper[4867]: I1006 13:22:05.980182 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c47455745-hd5zg" podStartSLOduration=4.98015668 podStartE2EDuration="4.98015668s" podCreationTimestamp="2025-10-06 13:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:05.976639564 +0000 UTC m=+1105.434587708" watchObservedRunningTime="2025-10-06 13:22:05.98015668 +0000 UTC m=+1105.438104824" Oct 06 13:22:06 crc kubenswrapper[4867]: E1006 13:22:06.623177 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7 is running failed: container process not found" containerID="2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 06 13:22:06 crc 
kubenswrapper[4867]: E1006 13:22:06.626812 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7 is running failed: container process not found" containerID="2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 06 13:22:06 crc kubenswrapper[4867]: E1006 13:22:06.628516 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7 is running failed: container process not found" containerID="2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 06 13:22:06 crc kubenswrapper[4867]: E1006 13:22:06.628716 4867 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="78044486-0042-4b06-ae63-0dbfe9b873ce" containerName="watcher-applier" Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.639388 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-d8hp7" Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.723401 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjn7w\" (UniqueName: \"kubernetes.io/projected/4f59ab79-d706-4e1f-9361-6efea6b85568-kube-api-access-mjn7w\") pod \"4f59ab79-d706-4e1f-9361-6efea6b85568\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.723472 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-combined-ca-bundle\") pod \"4f59ab79-d706-4e1f-9361-6efea6b85568\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.723598 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-scripts\") pod \"4f59ab79-d706-4e1f-9361-6efea6b85568\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.723954 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f59ab79-d706-4e1f-9361-6efea6b85568-logs\") pod \"4f59ab79-d706-4e1f-9361-6efea6b85568\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.724026 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-config-data\") pod \"4f59ab79-d706-4e1f-9361-6efea6b85568\" (UID: \"4f59ab79-d706-4e1f-9361-6efea6b85568\") " Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.729982 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4f59ab79-d706-4e1f-9361-6efea6b85568-logs" (OuterVolumeSpecName: "logs") pod "4f59ab79-d706-4e1f-9361-6efea6b85568" (UID: "4f59ab79-d706-4e1f-9361-6efea6b85568"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.758550 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-scripts" (OuterVolumeSpecName: "scripts") pod "4f59ab79-d706-4e1f-9361-6efea6b85568" (UID: "4f59ab79-d706-4e1f-9361-6efea6b85568"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.783587 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f59ab79-d706-4e1f-9361-6efea6b85568-kube-api-access-mjn7w" (OuterVolumeSpecName: "kube-api-access-mjn7w") pod "4f59ab79-d706-4e1f-9361-6efea6b85568" (UID: "4f59ab79-d706-4e1f-9361-6efea6b85568"). InnerVolumeSpecName "kube-api-access-mjn7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.820072 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-config-data" (OuterVolumeSpecName: "config-data") pod "4f59ab79-d706-4e1f-9361-6efea6b85568" (UID: "4f59ab79-d706-4e1f-9361-6efea6b85568"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.828280 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjn7w\" (UniqueName: \"kubernetes.io/projected/4f59ab79-d706-4e1f-9361-6efea6b85568-kube-api-access-mjn7w\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.828317 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.828326 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f59ab79-d706-4e1f-9361-6efea6b85568-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.828338 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.872431 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f59ab79-d706-4e1f-9361-6efea6b85568" (UID: "4f59ab79-d706-4e1f-9361-6efea6b85568"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:06 crc kubenswrapper[4867]: I1006 13:22:06.940105 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f59ab79-d706-4e1f-9361-6efea6b85568-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.056239 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f72b9a47-8845-4870-9e50-d15fa17e2db4","Type":"ContainerStarted","Data":"da6dcfbcb5bf5b7cc561c00c394063c697c4ea8db4cb00b224323ef2bbe369c9"} Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.065523 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.077060 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d8hp7" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.077788 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d8hp7" event={"ID":"4f59ab79-d706-4e1f-9361-6efea6b85568","Type":"ContainerDied","Data":"8f30944820092d0b9d03142e9535f82eb095ea1e0030bfd3c486f5a60b9a66d3"} Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.077877 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f30944820092d0b9d03142e9535f82eb095ea1e0030bfd3c486f5a60b9a66d3" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.126395 4867 generic.go:334] "Generic (PLEG): container finished" podID="3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3" containerID="469cdde25e5de2f983012628bf0abd04ecd346dda8914428572b0c7bc1a08662" exitCode=1 Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.126493 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3","Type":"ContainerDied","Data":"469cdde25e5de2f983012628bf0abd04ecd346dda8914428572b0c7bc1a08662"} Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.157853 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-combined-ca-bundle\") pod \"78044486-0042-4b06-ae63-0dbfe9b873ce\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.157930 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-config-data\") pod \"78044486-0042-4b06-ae63-0dbfe9b873ce\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.158205 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv522\" (UniqueName: \"kubernetes.io/projected/78044486-0042-4b06-ae63-0dbfe9b873ce-kube-api-access-vv522\") pod \"78044486-0042-4b06-ae63-0dbfe9b873ce\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.158230 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78044486-0042-4b06-ae63-0dbfe9b873ce-logs\") pod \"78044486-0042-4b06-ae63-0dbfe9b873ce\" (UID: \"78044486-0042-4b06-ae63-0dbfe9b873ce\") " Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.159528 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78044486-0042-4b06-ae63-0dbfe9b873ce-logs" (OuterVolumeSpecName: "logs") pod "78044486-0042-4b06-ae63-0dbfe9b873ce" (UID: "78044486-0042-4b06-ae63-0dbfe9b873ce"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.176816 4867 generic.go:334] "Generic (PLEG): container finished" podID="78044486-0042-4b06-ae63-0dbfe9b873ce" containerID="2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7" exitCode=0 Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.177149 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78044486-0042-4b06-ae63-0dbfe9b873ce-kube-api-access-vv522" (OuterVolumeSpecName: "kube-api-access-vv522") pod "78044486-0042-4b06-ae63-0dbfe9b873ce" (UID: "78044486-0042-4b06-ae63-0dbfe9b873ce"). InnerVolumeSpecName "kube-api-access-vv522". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.177269 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"78044486-0042-4b06-ae63-0dbfe9b873ce","Type":"ContainerDied","Data":"2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7"} Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.177326 4867 scope.go:117] "RemoveContainer" containerID="2b4f928591147b2ce108497d46ac17448a1adb7c077dee072960152f6d4f56b7" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.177575 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.200387 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-594954fbc6-c2fc2"] Oct 06 13:22:07 crc kubenswrapper[4867]: E1006 13:22:07.201051 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f59ab79-d706-4e1f-9361-6efea6b85568" containerName="placement-db-sync" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.201073 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f59ab79-d706-4e1f-9361-6efea6b85568" containerName="placement-db-sync" Oct 06 13:22:07 crc kubenswrapper[4867]: E1006 13:22:07.201086 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78044486-0042-4b06-ae63-0dbfe9b873ce" containerName="watcher-applier" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.201093 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="78044486-0042-4b06-ae63-0dbfe9b873ce" containerName="watcher-applier" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.201360 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="78044486-0042-4b06-ae63-0dbfe9b873ce" containerName="watcher-applier" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.201386 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f59ab79-d706-4e1f-9361-6efea6b85568" containerName="placement-db-sync" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.202955 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.206864 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.207212 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.209603 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.209745 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.210487 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-62mxc" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.216323 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-594954fbc6-c2fc2"] Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.233662 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78044486-0042-4b06-ae63-0dbfe9b873ce" (UID: "78044486-0042-4b06-ae63-0dbfe9b873ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.261314 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv522\" (UniqueName: \"kubernetes.io/projected/78044486-0042-4b06-ae63-0dbfe9b873ce-kube-api-access-vv522\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.261343 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78044486-0042-4b06-ae63-0dbfe9b873ce-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.261353 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.329703 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-config-data" (OuterVolumeSpecName: "config-data") pod "78044486-0042-4b06-ae63-0dbfe9b873ce" (UID: "78044486-0042-4b06-ae63-0dbfe9b873ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.362755 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-public-tls-certs\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.362831 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-combined-ca-bundle\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.362876 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-internal-tls-certs\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.362895 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-scripts\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.362957 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-config-data\") pod \"placement-594954fbc6-c2fc2\" (UID: 
\"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.363007 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddb2a04-2d3f-4340-a512-8921427ba510-logs\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.363029 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rscf8\" (UniqueName: \"kubernetes.io/projected/7ddb2a04-2d3f-4340-a512-8921427ba510-kube-api-access-rscf8\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.363072 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78044486-0042-4b06-ae63-0dbfe9b873ce-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.391784 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.391825 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.391907 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.419793 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.465243 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-combined-ca-bundle\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.465336 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-internal-tls-certs\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.465372 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-scripts\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.465444 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-config-data\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.465501 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddb2a04-2d3f-4340-a512-8921427ba510-logs\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.465525 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rscf8\" (UniqueName: \"kubernetes.io/projected/7ddb2a04-2d3f-4340-a512-8921427ba510-kube-api-access-rscf8\") pod 
\"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.465553 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-public-tls-certs\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.472901 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-scripts\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.475731 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddb2a04-2d3f-4340-a512-8921427ba510-logs\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.495178 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-combined-ca-bundle\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.495700 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-public-tls-certs\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 
13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.500229 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-config-data\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.502810 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ddb2a04-2d3f-4340-a512-8921427ba510-internal-tls-certs\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.504927 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rscf8\" (UniqueName: \"kubernetes.io/projected/7ddb2a04-2d3f-4340-a512-8921427ba510-kube-api-access-rscf8\") pod \"placement-594954fbc6-c2fc2\" (UID: \"7ddb2a04-2d3f-4340-a512-8921427ba510\") " pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.535783 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.670894 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.718054 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.739356 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.774199 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 06 13:22:07 crc kubenswrapper[4867]: E1006 13:22:07.774772 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3" containerName="watcher-decision-engine" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.774793 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3" containerName="watcher-decision-engine" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.774993 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3" containerName="watcher-decision-engine" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.775805 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.781615 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.813613 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.885083 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-logs\") pod \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.885210 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-config-data\") pod \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.885282 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-combined-ca-bundle\") pod \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.885319 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-custom-prometheus-ca\") pod \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.885340 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb5c8\" (UniqueName: 
\"kubernetes.io/projected/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-kube-api-access-hb5c8\") pod \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\" (UID: \"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3\") " Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.885717 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5389fa15-6fc4-4154-9760-38f0653cb802-logs\") pod \"watcher-applier-0\" (UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.885757 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5389fa15-6fc4-4154-9760-38f0653cb802-config-data\") pod \"watcher-applier-0\" (UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.885791 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2zd\" (UniqueName: \"kubernetes.io/projected/5389fa15-6fc4-4154-9760-38f0653cb802-kube-api-access-tl2zd\") pod \"watcher-applier-0\" (UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.885848 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5389fa15-6fc4-4154-9760-38f0653cb802-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.897880 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-logs" (OuterVolumeSpecName: "logs") pod "3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3" 
(UID: "3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.917470 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-kube-api-access-hb5c8" (OuterVolumeSpecName: "kube-api-access-hb5c8") pod "3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3" (UID: "3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3"). InnerVolumeSpecName "kube-api-access-hb5c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.980395 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3" (UID: "3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.989617 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5389fa15-6fc4-4154-9760-38f0653cb802-logs\") pod \"watcher-applier-0\" (UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.989675 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5389fa15-6fc4-4154-9760-38f0653cb802-config-data\") pod \"watcher-applier-0\" (UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.989727 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2zd\" (UniqueName: 
\"kubernetes.io/projected/5389fa15-6fc4-4154-9760-38f0653cb802-kube-api-access-tl2zd\") pod \"watcher-applier-0\" (UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.989791 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5389fa15-6fc4-4154-9760-38f0653cb802-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.989857 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.989867 4867 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.989877 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb5c8\" (UniqueName: \"kubernetes.io/projected/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-kube-api-access-hb5c8\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:07 crc kubenswrapper[4867]: I1006 13:22:07.990953 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5389fa15-6fc4-4154-9760-38f0653cb802-logs\") pod \"watcher-applier-0\" (UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.012489 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5389fa15-6fc4-4154-9760-38f0653cb802-combined-ca-bundle\") pod \"watcher-applier-0\" 
(UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.022157 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5389fa15-6fc4-4154-9760-38f0653cb802-config-data\") pod \"watcher-applier-0\" (UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.053441 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3" (UID: "3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.054467 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-config-data" (OuterVolumeSpecName: "config-data") pod "3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3" (UID: "3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.057486 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2zd\" (UniqueName: \"kubernetes.io/projected/5389fa15-6fc4-4154-9760-38f0653cb802-kube-api-access-tl2zd\") pod \"watcher-applier-0\" (UID: \"5389fa15-6fc4-4154-9760-38f0653cb802\") " pod="openstack/watcher-applier-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.092977 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.093009 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.152503 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.336621 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae","Type":"ContainerStarted","Data":"5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90"} Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.339021 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.351879 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3","Type":"ContainerDied","Data":"b1a78f23360ebcd4d067b2c22ea82d3d1b72a97a2a34eb3d8776e13c069d394e"} Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.351968 4867 scope.go:117] "RemoveContainer" containerID="469cdde25e5de2f983012628bf0abd04ecd346dda8914428572b0c7bc1a08662" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.478534 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.487066 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.532667 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86dbf54f95-fr228" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.534511 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.551542 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.554764 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.584394 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.736338 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54babca-19bd-4a0b-a320-359b744ed066-logs\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.736439 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.736457 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.765382 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-594954fbc6-c2fc2"] Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.765666 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gspp7\" (UniqueName: 
\"kubernetes.io/projected/c54babca-19bd-4a0b-a320-359b744ed066-kube-api-access-gspp7\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.765812 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.777540 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fbf94969-mfmxl"] Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.779244 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" podUID="267fe65a-faaf-40f1-9a41-d44776aa6b53" containerName="dnsmasq-dns" containerID="cri-o://6b42b29e4d38d08e45440802df8a1e808c08264f146b225e8f1af53eb6d15221" gracePeriod=10 Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.870453 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54babca-19bd-4a0b-a320-359b744ed066-logs\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.870880 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.870902 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.870982 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gspp7\" (UniqueName: \"kubernetes.io/projected/c54babca-19bd-4a0b-a320-359b744ed066-kube-api-access-gspp7\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.871029 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.873165 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54babca-19bd-4a0b-a320-359b744ed066-logs\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.897869 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.898228 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.903239 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-config-data\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.924845 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gspp7\" (UniqueName: \"kubernetes.io/projected/c54babca-19bd-4a0b-a320-359b744ed066-kube-api-access-gspp7\") pod \"watcher-decision-engine-0\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:22:08 crc kubenswrapper[4867]: I1006 13:22:08.973003 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.201154 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.282140 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3" path="/var/lib/kubelet/pods/3b321ff2-c9c9-4eb5-80c2-b7b9b1ac3dc3/volumes" Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.284013 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78044486-0042-4b06-ae63-0dbfe9b873ce" path="/var/lib/kubelet/pods/78044486-0042-4b06-ae63-0dbfe9b873ce/volumes" Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.435661 4867 generic.go:334] "Generic (PLEG): container finished" podID="267fe65a-faaf-40f1-9a41-d44776aa6b53" containerID="6b42b29e4d38d08e45440802df8a1e808c08264f146b225e8f1af53eb6d15221" exitCode=0 Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.436126 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" event={"ID":"267fe65a-faaf-40f1-9a41-d44776aa6b53","Type":"ContainerDied","Data":"6b42b29e4d38d08e45440802df8a1e808c08264f146b225e8f1af53eb6d15221"} Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.472104 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"5389fa15-6fc4-4154-9760-38f0653cb802","Type":"ContainerStarted","Data":"257d6a680be8b26330f8c3e898aebaf54ff00d7cfd499f1b0b0a30b0e5ee51f1"} Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.533116 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae","Type":"ContainerStarted","Data":"7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c"} Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.557664 4867 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-594954fbc6-c2fc2" event={"ID":"7ddb2a04-2d3f-4340-a512-8921427ba510","Type":"ContainerStarted","Data":"b0ffa4fb74bf6d97cbc822b60844f47b73b3fccfcb5c3a6a0d6aedd55da4ae02"} Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.557725 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-594954fbc6-c2fc2" event={"ID":"7ddb2a04-2d3f-4340-a512-8921427ba510","Type":"ContainerStarted","Data":"4d410caad4f1ea9003d2607051b7e37bbdc55025b522f393dd9a05f52969e4d4"} Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.583871 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.583845055 podStartE2EDuration="6.583845055s" podCreationTimestamp="2025-10-06 13:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:09.564879856 +0000 UTC m=+1109.022828010" watchObservedRunningTime="2025-10-06 13:22:09.583845055 +0000 UTC m=+1109.041793199" Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.589510 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f72b9a47-8845-4870-9e50-d15fa17e2db4","Type":"ContainerStarted","Data":"95703887daf9ae67a46b2677bc3b3b2f68fc6675b5f947312336d5039a06c32b"} Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.591739 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.629486 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.629464212 podStartE2EDuration="5.629464212s" podCreationTimestamp="2025-10-06 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:09.612675173 +0000 UTC m=+1109.070623317" watchObservedRunningTime="2025-10-06 13:22:09.629464212 +0000 UTC m=+1109.087412356" Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.715855 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-config\") pod \"267fe65a-faaf-40f1-9a41-d44776aa6b53\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.715930 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-sb\") pod \"267fe65a-faaf-40f1-9a41-d44776aa6b53\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.715992 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-nb\") pod \"267fe65a-faaf-40f1-9a41-d44776aa6b53\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.716036 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-svc\") pod \"267fe65a-faaf-40f1-9a41-d44776aa6b53\" (UID: 
\"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.716134 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-296t8\" (UniqueName: \"kubernetes.io/projected/267fe65a-faaf-40f1-9a41-d44776aa6b53-kube-api-access-296t8\") pod \"267fe65a-faaf-40f1-9a41-d44776aa6b53\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.716195 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-swift-storage-0\") pod \"267fe65a-faaf-40f1-9a41-d44776aa6b53\" (UID: \"267fe65a-faaf-40f1-9a41-d44776aa6b53\") " Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.743276 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267fe65a-faaf-40f1-9a41-d44776aa6b53-kube-api-access-296t8" (OuterVolumeSpecName: "kube-api-access-296t8") pod "267fe65a-faaf-40f1-9a41-d44776aa6b53" (UID: "267fe65a-faaf-40f1-9a41-d44776aa6b53"). InnerVolumeSpecName "kube-api-access-296t8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.822156 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-296t8\" (UniqueName: \"kubernetes.io/projected/267fe65a-faaf-40f1-9a41-d44776aa6b53-kube-api-access-296t8\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:09 crc kubenswrapper[4867]: I1006 13:22:09.868344 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.123590 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "267fe65a-faaf-40f1-9a41-d44776aa6b53" (UID: "267fe65a-faaf-40f1-9a41-d44776aa6b53"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.140218 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.198641 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "267fe65a-faaf-40f1-9a41-d44776aa6b53" (UID: "267fe65a-faaf-40f1-9a41-d44776aa6b53"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.243209 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.295662 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-config" (OuterVolumeSpecName: "config") pod "267fe65a-faaf-40f1-9a41-d44776aa6b53" (UID: "267fe65a-faaf-40f1-9a41-d44776aa6b53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.346559 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.405116 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "267fe65a-faaf-40f1-9a41-d44776aa6b53" (UID: "267fe65a-faaf-40f1-9a41-d44776aa6b53"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.449367 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.507108 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "267fe65a-faaf-40f1-9a41-d44776aa6b53" (UID: "267fe65a-faaf-40f1-9a41-d44776aa6b53"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.553849 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267fe65a-faaf-40f1-9a41-d44776aa6b53-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.625322 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"5389fa15-6fc4-4154-9760-38f0653cb802","Type":"ContainerStarted","Data":"329227718d30c6a581b7159c43a8d1a1fe2a9b7d7562f94db4a25fdb35802448"} Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.645071 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-594954fbc6-c2fc2" event={"ID":"7ddb2a04-2d3f-4340-a512-8921427ba510","Type":"ContainerStarted","Data":"162eb9a7ff6c2d25d943cf2e6d5ac725b1ef090213962ad8be15cda0a8df9374"} Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.646537 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-594954fbc6-c2fc2" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.646574 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-594954fbc6-c2fc2" Oct 06 
13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.665736 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.665712215 podStartE2EDuration="3.665712215s" podCreationTimestamp="2025-10-06 13:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:10.65564813 +0000 UTC m=+1110.113596274" watchObservedRunningTime="2025-10-06 13:22:10.665712215 +0000 UTC m=+1110.123660359" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.666745 4867 generic.go:334] "Generic (PLEG): container finished" podID="9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255" containerID="b156a56991af149be07f68a2d46a2d3973ab6eb4e2e2b1fb6f24753c0d2e173a" exitCode=0 Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.666850 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-plcg5" event={"ID":"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255","Type":"ContainerDied","Data":"b156a56991af149be07f68a2d46a2d3973ab6eb4e2e2b1fb6f24753c0d2e173a"} Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.681746 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" event={"ID":"267fe65a-faaf-40f1-9a41-d44776aa6b53","Type":"ContainerDied","Data":"403a12e88b9bf4fba69f34e224d9b6b92c8044a59932060ce54a7a1b78a3b0f0"} Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.681810 4867 scope.go:117] "RemoveContainer" containerID="6b42b29e4d38d08e45440802df8a1e808c08264f146b225e8f1af53eb6d15221" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.681976 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9fbf94969-mfmxl" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.698157 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-594954fbc6-c2fc2" podStartSLOduration=3.698137352 podStartE2EDuration="3.698137352s" podCreationTimestamp="2025-10-06 13:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:10.685818105 +0000 UTC m=+1110.143766239" watchObservedRunningTime="2025-10-06 13:22:10.698137352 +0000 UTC m=+1110.156085496" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.708483 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c54babca-19bd-4a0b-a320-359b744ed066","Type":"ContainerStarted","Data":"e9e0d6952a07d7ae3820dd4215e92d3944b9e4d2ff2f8669c94d7baeaa6597b7"} Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.789326 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fbf94969-mfmxl"] Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.820915 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9fbf94969-mfmxl"] Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.834687 4867 scope.go:117] "RemoveContainer" containerID="30a337ee23fafb35887833b3942c84a4940ddf9d4206f2139fdb778c8e93ee4c" Oct 06 13:22:10 crc kubenswrapper[4867]: I1006 13:22:10.941609 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 13:22:11 crc kubenswrapper[4867]: I1006 13:22:11.340021 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267fe65a-faaf-40f1-9a41-d44776aa6b53" 
path="/var/lib/kubelet/pods/267fe65a-faaf-40f1-9a41-d44776aa6b53/volumes" Oct 06 13:22:11 crc kubenswrapper[4867]: I1006 13:22:11.748347 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c54babca-19bd-4a0b-a320-359b744ed066","Type":"ContainerStarted","Data":"0f802edb63fdf8dca1884c34456bdfaae9343ba7d28f568c77ce301a24ce535d"} Oct 06 13:22:11 crc kubenswrapper[4867]: I1006 13:22:11.761932 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m8h8z" event={"ID":"69ef6013-a982-45c6-8fc8-46c11fead4a7","Type":"ContainerStarted","Data":"aa1955bd06810fbd441cf73abe03679e07d67298e6bb63ad9ce5865b8cd5d85d"} Oct 06 13:22:11 crc kubenswrapper[4867]: I1006 13:22:11.775041 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=3.775023007 podStartE2EDuration="3.775023007s" podCreationTimestamp="2025-10-06 13:22:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:11.77039319 +0000 UTC m=+1111.228341334" watchObservedRunningTime="2025-10-06 13:22:11.775023007 +0000 UTC m=+1111.232971151" Oct 06 13:22:11 crc kubenswrapper[4867]: I1006 13:22:11.823435 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-m8h8z" podStartSLOduration=5.815330387 podStartE2EDuration="49.8234128s" podCreationTimestamp="2025-10-06 13:21:22 +0000 UTC" firstStartedPulling="2025-10-06 13:21:25.394785213 +0000 UTC m=+1064.852733357" lastFinishedPulling="2025-10-06 13:22:09.402867636 +0000 UTC m=+1108.860815770" observedRunningTime="2025-10-06 13:22:11.790745837 +0000 UTC m=+1111.248693981" watchObservedRunningTime="2025-10-06 13:22:11.8234128 +0000 UTC m=+1111.281360944" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.118531 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.119287 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.314415 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.326235 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.326494 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-69d5cf7ffb-c2rgt" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.463427 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.577095 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-scripts\") pod \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.577211 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-credential-keys\") pod \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.577294 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-config-data\") pod \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " Oct 06 13:22:12 crc 
kubenswrapper[4867]: I1006 13:22:12.577341 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7wsb\" (UniqueName: \"kubernetes.io/projected/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-kube-api-access-b7wsb\") pod \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.577395 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-fernet-keys\") pod \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.577486 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-combined-ca-bundle\") pod \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\" (UID: \"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255\") " Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.607659 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-scripts" (OuterVolumeSpecName: "scripts") pod "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255" (UID: "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.615117 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255" (UID: "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.615589 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-kube-api-access-b7wsb" (OuterVolumeSpecName: "kube-api-access-b7wsb") pod "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255" (UID: "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255"). InnerVolumeSpecName "kube-api-access-b7wsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.615884 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255" (UID: "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.688751 4867 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.688790 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7wsb\" (UniqueName: \"kubernetes.io/projected/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-kube-api-access-b7wsb\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.688802 4867 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.688814 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-scripts\") on node \"crc\" DevicePath 
\"\"" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.717720 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-config-data" (OuterVolumeSpecName: "config-data") pod "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255" (UID: "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.759003 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255" (UID: "9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.791880 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.791923 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.854851 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-plcg5" event={"ID":"9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255","Type":"ContainerDied","Data":"1912d9b772b7f62d25ab984e675b7568bd0a83dd566168324569e067e2a2fe0d"} Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.854902 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1912d9b772b7f62d25ab984e675b7568bd0a83dd566168324569e067e2a2fe0d" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 
13:22:12.855022 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-plcg5" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.900419 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4vzlg" event={"ID":"1ab90007-6383-4ff2-97cc-edb5d7d13d1e","Type":"ContainerStarted","Data":"cad0d3e4e6941aba5ca0359acf556a3542d67087e60b8a29f95c9045523ff4e6"} Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.907022 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-dcf7c7d6f-dz9mk"] Oct 06 13:22:12 crc kubenswrapper[4867]: E1006 13:22:12.920092 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255" containerName="keystone-bootstrap" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.921051 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255" containerName="keystone-bootstrap" Oct 06 13:22:12 crc kubenswrapper[4867]: E1006 13:22:12.921180 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267fe65a-faaf-40f1-9a41-d44776aa6b53" containerName="init" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.921238 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="267fe65a-faaf-40f1-9a41-d44776aa6b53" containerName="init" Oct 06 13:22:12 crc kubenswrapper[4867]: E1006 13:22:12.921341 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267fe65a-faaf-40f1-9a41-d44776aa6b53" containerName="dnsmasq-dns" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.921394 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="267fe65a-faaf-40f1-9a41-d44776aa6b53" containerName="dnsmasq-dns" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.921894 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="267fe65a-faaf-40f1-9a41-d44776aa6b53" containerName="dnsmasq-dns" Oct 06 13:22:12 crc 
kubenswrapper[4867]: I1006 13:22:12.922033 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255" containerName="keystone-bootstrap" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.922860 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.930751 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dcf7c7d6f-dz9mk"] Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.931207 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.931658 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.931980 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.932381 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sjvr2" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.932639 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.932844 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 13:22:12 crc kubenswrapper[4867]: I1006 13:22:12.980666 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-4vzlg" podStartSLOduration=4.468525331 podStartE2EDuration="50.980642722s" podCreationTimestamp="2025-10-06 13:21:22 +0000 UTC" firstStartedPulling="2025-10-06 13:21:25.031421047 +0000 UTC m=+1064.489369191" lastFinishedPulling="2025-10-06 13:22:11.543538438 +0000 UTC m=+1111.001486582" 
observedRunningTime="2025-10-06 13:22:12.954309902 +0000 UTC m=+1112.412258056" watchObservedRunningTime="2025-10-06 13:22:12.980642722 +0000 UTC m=+1112.438590866" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.014357 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-scripts\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.014767 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-public-tls-certs\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.014860 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-combined-ca-bundle\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.014990 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrfw5\" (UniqueName: \"kubernetes.io/projected/dd2a23fb-89c2-4a8c-b670-3f8330f13265-kube-api-access-rrfw5\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.015137 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-fernet-keys\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.015223 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-credential-keys\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.015356 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-config-data\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.015478 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-internal-tls-certs\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.076772 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.077044 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api-log" containerID="cri-o://9cc4014c17d550097efe2207ca0ce53361ab7e1be6cc2bccd4ec76c4248e8ab8" gracePeriod=30 Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.077347 4867 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/watcher-api-0" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api" containerID="cri-o://d2a9f0156e720f2b5bf975b13afd79e9cfdd4b1fbf4cb328c04042ca361342fa" gracePeriod=30 Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.118221 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-public-tls-certs\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.118737 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-combined-ca-bundle\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.118791 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrfw5\" (UniqueName: \"kubernetes.io/projected/dd2a23fb-89c2-4a8c-b670-3f8330f13265-kube-api-access-rrfw5\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.118857 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-fernet-keys\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.118902 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-credential-keys\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.118944 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-config-data\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.118987 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-internal-tls-certs\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.119005 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-scripts\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.139554 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-scripts\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.140042 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-public-tls-certs\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: 
\"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.140042 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-combined-ca-bundle\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.140335 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-fernet-keys\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.141820 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-internal-tls-certs\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.144693 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-credential-keys\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.148668 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2a23fb-89c2-4a8c-b670-3f8330f13265-config-data\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 
13:22:13.155350 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.169781 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrfw5\" (UniqueName: \"kubernetes.io/projected/dd2a23fb-89c2-4a8c-b670-3f8330f13265-kube-api-access-rrfw5\") pod \"keystone-dcf7c7d6f-dz9mk\" (UID: \"dd2a23fb-89c2-4a8c-b670-3f8330f13265\") " pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.273121 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.960542 4867 generic.go:334] "Generic (PLEG): container finished" podID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerID="9cc4014c17d550097efe2207ca0ce53361ab7e1be6cc2bccd4ec76c4248e8ab8" exitCode=143 Oct 06 13:22:13 crc kubenswrapper[4867]: I1006 13:22:13.960818 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b","Type":"ContainerDied","Data":"9cc4014c17d550097efe2207ca0ce53361ab7e1be6cc2bccd4ec76c4248e8ab8"} Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.303380 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dcf7c7d6f-dz9mk"] Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.464996 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.465526 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.525987 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:14 crc kubenswrapper[4867]: 
I1006 13:22:14.526457 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.526479 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.587689 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.592111 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.683279 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.984243 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dcf7c7d6f-dz9mk" event={"ID":"dd2a23fb-89c2-4a8c-b670-3f8330f13265","Type":"ContainerStarted","Data":"11544ccb127159a11b5f31a5afd3d87aa1aa36509706407f17616c97e93901f9"} Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.984716 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dcf7c7d6f-dz9mk" event={"ID":"dd2a23fb-89c2-4a8c-b670-3f8330f13265","Type":"ContainerStarted","Data":"cc79845f601514b030feb3ccfb3039f329832b35d7590ea85b60aea5ad46384c"} Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.984900 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.984929 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.984940 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Oct 06 13:22:14 crc kubenswrapper[4867]: I1006 13:22:14.984948 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:15 crc kubenswrapper[4867]: I1006 13:22:15.007550 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-dcf7c7d6f-dz9mk" podStartSLOduration=3.007527033 podStartE2EDuration="3.007527033s" podCreationTimestamp="2025-10-06 13:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:15.002774703 +0000 UTC m=+1114.460722847" watchObservedRunningTime="2025-10-06 13:22:15.007527033 +0000 UTC m=+1114.465475177" Oct 06 13:22:15 crc kubenswrapper[4867]: I1006 13:22:15.999184 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-dcf7c7d6f-dz9mk" Oct 06 13:22:17 crc kubenswrapper[4867]: I1006 13:22:17.021270 4867 generic.go:334] "Generic (PLEG): container finished" podID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerID="d2a9f0156e720f2b5bf975b13afd79e9cfdd4b1fbf4cb328c04042ca361342fa" exitCode=0 Oct 06 13:22:17 crc kubenswrapper[4867]: I1006 13:22:17.021747 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b","Type":"ContainerDied","Data":"d2a9f0156e720f2b5bf975b13afd79e9cfdd4b1fbf4cb328c04042ca361342fa"} Oct 06 13:22:17 crc kubenswrapper[4867]: I1006 13:22:17.030155 4867 generic.go:334] "Generic (PLEG): container finished" podID="c54babca-19bd-4a0b-a320-359b744ed066" containerID="0f802edb63fdf8dca1884c34456bdfaae9343ba7d28f568c77ce301a24ce535d" exitCode=1 Oct 06 13:22:17 crc kubenswrapper[4867]: I1006 13:22:17.030241 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"c54babca-19bd-4a0b-a320-359b744ed066","Type":"ContainerDied","Data":"0f802edb63fdf8dca1884c34456bdfaae9343ba7d28f568c77ce301a24ce535d"} Oct 06 13:22:17 crc kubenswrapper[4867]: I1006 13:22:17.031475 4867 scope.go:117] "RemoveContainer" containerID="0f802edb63fdf8dca1884c34456bdfaae9343ba7d28f568c77ce301a24ce535d" Oct 06 13:22:17 crc kubenswrapper[4867]: I1006 13:22:17.268956 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": dial tcp 10.217.0.167:9322: connect: connection refused" Oct 06 13:22:17 crc kubenswrapper[4867]: I1006 13:22:17.269296 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": dial tcp 10.217.0.167:9322: connect: connection refused" Oct 06 13:22:18 crc kubenswrapper[4867]: I1006 13:22:18.046784 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:18 crc kubenswrapper[4867]: I1006 13:22:18.046919 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 13:22:18 crc kubenswrapper[4867]: I1006 13:22:18.153981 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 06 13:22:18 crc kubenswrapper[4867]: I1006 13:22:18.171035 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 13:22:18 crc kubenswrapper[4867]: I1006 13:22:18.171178 4867 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 13:22:18 crc kubenswrapper[4867]: I1006 13:22:18.205740 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/watcher-applier-0" Oct 06 13:22:18 crc kubenswrapper[4867]: I1006 13:22:18.480061 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 13:22:18 crc kubenswrapper[4867]: I1006 13:22:18.563460 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 13:22:18 crc kubenswrapper[4867]: I1006 13:22:18.973823 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:18 crc kubenswrapper[4867]: I1006 13:22:18.973878 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:19 crc kubenswrapper[4867]: I1006 13:22:19.143958 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 06 13:22:21 crc kubenswrapper[4867]: I1006 13:22:21.077976 4867 generic.go:334] "Generic (PLEG): container finished" podID="1ab90007-6383-4ff2-97cc-edb5d7d13d1e" containerID="cad0d3e4e6941aba5ca0359acf556a3542d67087e60b8a29f95c9045523ff4e6" exitCode=0 Oct 06 13:22:21 crc kubenswrapper[4867]: I1006 13:22:21.078057 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4vzlg" event={"ID":"1ab90007-6383-4ff2-97cc-edb5d7d13d1e","Type":"ContainerDied","Data":"cad0d3e4e6941aba5ca0359acf556a3542d67087e60b8a29f95c9045523ff4e6"} Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.120240 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-577bfb968d-pw7pq" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.164:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.164:8443: connect: connection refused" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.268685 4867 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/watcher-api-0" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": dial tcp 10.217.0.167:9322: connect: connection refused" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.268685 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9322/\": dial tcp 10.217.0.167:9322: connect: connection refused" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.348513 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-69d5cf7ffb-c2rgt" podUID="d7e92d5c-74ed-47bc-995a-d3712014f109" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.628079 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.710011 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5q2m\" (UniqueName: \"kubernetes.io/projected/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-kube-api-access-p5q2m\") pod \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.710682 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-db-sync-config-data\") pod \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.710763 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-combined-ca-bundle\") pod \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\" (UID: \"1ab90007-6383-4ff2-97cc-edb5d7d13d1e\") " Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.740275 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1ab90007-6383-4ff2-97cc-edb5d7d13d1e" (UID: "1ab90007-6383-4ff2-97cc-edb5d7d13d1e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.740474 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-kube-api-access-p5q2m" (OuterVolumeSpecName: "kube-api-access-p5q2m") pod "1ab90007-6383-4ff2-97cc-edb5d7d13d1e" (UID: "1ab90007-6383-4ff2-97cc-edb5d7d13d1e"). 
InnerVolumeSpecName "kube-api-access-p5q2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.749514 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ab90007-6383-4ff2-97cc-edb5d7d13d1e" (UID: "1ab90007-6383-4ff2-97cc-edb5d7d13d1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.770195 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.812860 4867 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.812897 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.812907 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5q2m\" (UniqueName: \"kubernetes.io/projected/1ab90007-6383-4ff2-97cc-edb5d7d13d1e-kube-api-access-p5q2m\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.914001 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-config-data\") pod \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.914112 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-custom-prometheus-ca\") pod \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.914163 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-combined-ca-bundle\") pod \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.914247 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vht4\" (UniqueName: \"kubernetes.io/projected/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-kube-api-access-9vht4\") pod \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.914332 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-logs\") pod \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\" (UID: \"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b\") " Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.915215 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-logs" (OuterVolumeSpecName: "logs") pod "a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" (UID: "a22fa1f9-5b8f-448b-9465-e1a8786f4b3b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.931999 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-kube-api-access-9vht4" (OuterVolumeSpecName: "kube-api-access-9vht4") pod "a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" (UID: "a22fa1f9-5b8f-448b-9465-e1a8786f4b3b"). InnerVolumeSpecName "kube-api-access-9vht4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.968066 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" (UID: "a22fa1f9-5b8f-448b-9465-e1a8786f4b3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.986313 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" (UID: "a22fa1f9-5b8f-448b-9465-e1a8786f4b3b"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:22 crc kubenswrapper[4867]: I1006 13:22:22.993681 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-config-data" (OuterVolumeSpecName: "config-data") pod "a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" (UID: "a22fa1f9-5b8f-448b-9465-e1a8786f4b3b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.016635 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.016689 4867 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.016707 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.016723 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vht4\" (UniqueName: \"kubernetes.io/projected/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-kube-api-access-9vht4\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.016736 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.103023 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01","Type":"ContainerStarted","Data":"4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4"} Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.105039 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4vzlg" event={"ID":"1ab90007-6383-4ff2-97cc-edb5d7d13d1e","Type":"ContainerDied","Data":"2e4bd65fe944303c7e779fd33caed054fbcb28d9c547fbd0835990d3419da410"} 
Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.105067 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e4bd65fe944303c7e779fd33caed054fbcb28d9c547fbd0835990d3419da410" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.105114 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4vzlg" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.107056 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a22fa1f9-5b8f-448b-9465-e1a8786f4b3b","Type":"ContainerDied","Data":"265e83a1b11783d3c534499e5ac8cc34db6d87b63cdf55ae18e186bb816f9c6e"} Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.107117 4867 scope.go:117] "RemoveContainer" containerID="d2a9f0156e720f2b5bf975b13afd79e9cfdd4b1fbf4cb328c04042ca361342fa" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.107288 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.110676 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c54babca-19bd-4a0b-a320-359b744ed066","Type":"ContainerStarted","Data":"2db97eecc610c3e34e2ba47ca51b7c8081ac513991aa1abd2d67d0e2841f3efe"} Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.116987 4867 generic.go:334] "Generic (PLEG): container finished" podID="69ef6013-a982-45c6-8fc8-46c11fead4a7" containerID="aa1955bd06810fbd441cf73abe03679e07d67298e6bb63ad9ce5865b8cd5d85d" exitCode=0 Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.117051 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m8h8z" event={"ID":"69ef6013-a982-45c6-8fc8-46c11fead4a7","Type":"ContainerDied","Data":"aa1955bd06810fbd441cf73abe03679e07d67298e6bb63ad9ce5865b8cd5d85d"} Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.137722 4867 
scope.go:117] "RemoveContainer" containerID="9cc4014c17d550097efe2207ca0ce53361ab7e1be6cc2bccd4ec76c4248e8ab8" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.185685 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.207260 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.218208 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:22:23 crc kubenswrapper[4867]: E1006 13:22:23.218918 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api-log" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.218948 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api-log" Oct 06 13:22:23 crc kubenswrapper[4867]: E1006 13:22:23.218978 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.218987 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api" Oct 06 13:22:23 crc kubenswrapper[4867]: E1006 13:22:23.219021 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab90007-6383-4ff2-97cc-edb5d7d13d1e" containerName="barbican-db-sync" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.219028 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab90007-6383-4ff2-97cc-edb5d7d13d1e" containerName="barbican-db-sync" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.219259 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab90007-6383-4ff2-97cc-edb5d7d13d1e" containerName="barbican-db-sync" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 
13:22:23.219297 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api-log" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.219315 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" containerName="watcher-api" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.220775 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.227360 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.227586 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.228299 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.258152 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a22fa1f9-5b8f-448b-9465-e1a8786f4b3b" path="/var/lib/kubelet/pods/a22fa1f9-5b8f-448b-9465-e1a8786f4b3b/volumes" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.263305 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.328447 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.328490 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" 
(UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.328538 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.328563 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vds4\" (UniqueName: \"kubernetes.io/projected/288a6591-36fc-453e-b41f-c0bed1da11b6-kube-api-access-9vds4\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.328591 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-config-data\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.328667 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.328709 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/288a6591-36fc-453e-b41f-c0bed1da11b6-logs\") pod \"watcher-api-0\" 
(UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.423306 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5fc4fffc87-p6rts"] Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.425309 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.429705 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-s72xt" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.429907 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.431639 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.431690 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/288a6591-36fc-453e-b41f-c0bed1da11b6-logs\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.431732 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.431751 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.431785 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.431812 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vds4\" (UniqueName: \"kubernetes.io/projected/288a6591-36fc-453e-b41f-c0bed1da11b6-kube-api-access-9vds4\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.431836 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-config-data\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.443652 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/288a6591-36fc-453e-b41f-c0bed1da11b6-logs\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.446563 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.451972 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-config-data\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.457795 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.458365 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.460124 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp"] Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.461877 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.462920 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.467629 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/288a6591-36fc-453e-b41f-c0bed1da11b6-public-tls-certs\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.487493 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.489229 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5fc4fffc87-p6rts"] Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.520904 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp"] Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.524057 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vds4\" (UniqueName: \"kubernetes.io/projected/288a6591-36fc-453e-b41f-c0bed1da11b6-kube-api-access-9vds4\") pod \"watcher-api-0\" (UID: \"288a6591-36fc-453e-b41f-c0bed1da11b6\") " pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.533197 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stwb8\" (UniqueName: \"kubernetes.io/projected/cdf0758d-d2e6-4660-8b3b-677c5febec8f-kube-api-access-stwb8\") pod 
\"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.533255 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdf0758d-d2e6-4660-8b3b-677c5febec8f-config-data-custom\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.533308 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6057ffc-7d15-4097-b9d2-677fa9e69920-config-data-custom\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.533387 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8gwf\" (UniqueName: \"kubernetes.io/projected/f6057ffc-7d15-4097-b9d2-677fa9e69920-kube-api-access-d8gwf\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.533441 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf0758d-d2e6-4660-8b3b-677c5febec8f-config-data\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.533473 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6057ffc-7d15-4097-b9d2-677fa9e69920-logs\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.533496 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6057ffc-7d15-4097-b9d2-677fa9e69920-combined-ca-bundle\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.533520 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6057ffc-7d15-4097-b9d2-677fa9e69920-config-data\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.533560 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf0758d-d2e6-4660-8b3b-677c5febec8f-combined-ca-bundle\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.533605 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdf0758d-d2e6-4660-8b3b-677c5febec8f-logs\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 
13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.567761 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.649778 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8gwf\" (UniqueName: \"kubernetes.io/projected/f6057ffc-7d15-4097-b9d2-677fa9e69920-kube-api-access-d8gwf\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.649856 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf0758d-d2e6-4660-8b3b-677c5febec8f-config-data\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.649887 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6057ffc-7d15-4097-b9d2-677fa9e69920-logs\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.649907 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6057ffc-7d15-4097-b9d2-677fa9e69920-combined-ca-bundle\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.649930 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6057ffc-7d15-4097-b9d2-677fa9e69920-config-data\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.649967 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf0758d-d2e6-4660-8b3b-677c5febec8f-combined-ca-bundle\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.650006 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdf0758d-d2e6-4660-8b3b-677c5febec8f-logs\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.650033 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stwb8\" (UniqueName: \"kubernetes.io/projected/cdf0758d-d2e6-4660-8b3b-677c5febec8f-kube-api-access-stwb8\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.650049 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdf0758d-d2e6-4660-8b3b-677c5febec8f-config-data-custom\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.650074 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6057ffc-7d15-4097-b9d2-677fa9e69920-config-data-custom\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.653248 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdf0758d-d2e6-4660-8b3b-677c5febec8f-logs\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.656671 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6057ffc-7d15-4097-b9d2-677fa9e69920-logs\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.661630 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6057ffc-7d15-4097-b9d2-677fa9e69920-combined-ca-bundle\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.662583 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6057ffc-7d15-4097-b9d2-677fa9e69920-config-data\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.674975 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6057ffc-7d15-4097-b9d2-677fa9e69920-config-data-custom\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.677999 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdf0758d-d2e6-4660-8b3b-677c5febec8f-config-data-custom\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.678482 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf0758d-d2e6-4660-8b3b-677c5febec8f-combined-ca-bundle\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.678700 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf0758d-d2e6-4660-8b3b-677c5febec8f-config-data\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.681810 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8gwf\" (UniqueName: \"kubernetes.io/projected/f6057ffc-7d15-4097-b9d2-677fa9e69920-kube-api-access-d8gwf\") pod \"barbican-worker-5fc4fffc87-p6rts\" (UID: \"f6057ffc-7d15-4097-b9d2-677fa9e69920\") " pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.682336 4867 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-7b984779c5-sfr2l"] Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.695071 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stwb8\" (UniqueName: \"kubernetes.io/projected/cdf0758d-d2e6-4660-8b3b-677c5febec8f-kube-api-access-stwb8\") pod \"barbican-keystone-listener-5f64dc9ddb-5ltwp\" (UID: \"cdf0758d-d2e6-4660-8b3b-677c5febec8f\") " pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.696350 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.705423 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-58bc94ddbb-kpk6w"] Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.707230 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.714106 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.745355 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b984779c5-sfr2l"] Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.769181 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58bc94ddbb-kpk6w"] Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.860579 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-nb\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.861099 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-combined-ca-bundle\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.861129 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-svc\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.861149 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvctx\" (UniqueName: \"kubernetes.io/projected/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-kube-api-access-dvctx\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.861304 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-sb\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.861352 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shlzz\" (UniqueName: \"kubernetes.io/projected/97cb892b-93d3-4c28-8c32-e2abaaa80dce-kube-api-access-shlzz\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 
06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.861379 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cb892b-93d3-4c28-8c32-e2abaaa80dce-logs\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.861408 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.861437 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-swift-storage-0\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.863448 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data-custom\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.863793 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-config\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " 
pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.966101 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-config\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.966251 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-nb\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.966299 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-combined-ca-bundle\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.966326 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-svc\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.966351 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvctx\" (UniqueName: \"kubernetes.io/projected/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-kube-api-access-dvctx\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 
crc kubenswrapper[4867]: I1006 13:22:23.966405 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-sb\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.966442 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shlzz\" (UniqueName: \"kubernetes.io/projected/97cb892b-93d3-4c28-8c32-e2abaaa80dce-kube-api-access-shlzz\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.966468 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cb892b-93d3-4c28-8c32-e2abaaa80dce-logs\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.966499 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.966530 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-swift-storage-0\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.966567 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data-custom\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.968427 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-config\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.969307 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-svc\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.969890 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5fc4fffc87-p6rts" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.970893 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cb892b-93d3-4c28-8c32-e2abaaa80dce-logs\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.970978 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-sb\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.972902 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-nb\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.972944 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-swift-storage-0\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.977639 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc 
kubenswrapper[4867]: I1006 13:22:23.985658 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data-custom\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.987474 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvctx\" (UniqueName: \"kubernetes.io/projected/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-kube-api-access-dvctx\") pod \"dnsmasq-dns-7b984779c5-sfr2l\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.988115 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-combined-ca-bundle\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.988388 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shlzz\" (UniqueName: \"kubernetes.io/projected/97cb892b-93d3-4c28-8c32-e2abaaa80dce-kube-api-access-shlzz\") pod \"barbican-api-58bc94ddbb-kpk6w\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:23 crc kubenswrapper[4867]: I1006 13:22:23.988481 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" Oct 06 13:22:24 crc kubenswrapper[4867]: I1006 13:22:24.033509 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:24 crc kubenswrapper[4867]: I1006 13:22:24.062682 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:24 crc kubenswrapper[4867]: I1006 13:22:24.353117 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.192475 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"288a6591-36fc-453e-b41f-c0bed1da11b6","Type":"ContainerStarted","Data":"e3779bd6c806c2c76722a8ee4ef81b268da817b19aa593cb0086467329fd0727"} Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.193206 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"288a6591-36fc-453e-b41f-c0bed1da11b6","Type":"ContainerStarted","Data":"5b723c5335c468d8e201633716cd21eb3fbe076db35693ce050511f205a86dc0"} Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.193224 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"288a6591-36fc-453e-b41f-c0bed1da11b6","Type":"ContainerStarted","Data":"9e6aa8c377b4e2fecc22554c0ffa6e03ab1e9af38438f779ff829468d34d7281"} Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.195563 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.198216 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="288a6591-36fc-453e-b41f-c0bed1da11b6" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.177:9322/\": dial tcp 10.217.0.177:9322: connect: connection refused" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.216463 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" 
podStartSLOduration=2.216446543 podStartE2EDuration="2.216446543s" podCreationTimestamp="2025-10-06 13:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:25.215936919 +0000 UTC m=+1124.673885073" watchObservedRunningTime="2025-10-06 13:22:25.216446543 +0000 UTC m=+1124.674394687" Oct 06 13:22:25 crc kubenswrapper[4867]: W1006 13:22:25.249050 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97cb892b_93d3_4c28_8c32_e2abaaa80dce.slice/crio-d47779adb6edadfcc3f1b8f986bee4acec3d885cb7d9d8b243a1d1a396f1aabe WatchSource:0}: Error finding container d47779adb6edadfcc3f1b8f986bee4acec3d885cb7d9d8b243a1d1a396f1aabe: Status 404 returned error can't find the container with id d47779adb6edadfcc3f1b8f986bee4acec3d885cb7d9d8b243a1d1a396f1aabe Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.260338 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58bc94ddbb-kpk6w"] Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.295237 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.406492 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5fc4fffc87-p6rts"] Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.445090 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-combined-ca-bundle\") pod \"69ef6013-a982-45c6-8fc8-46c11fead4a7\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.445143 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-scripts\") pod \"69ef6013-a982-45c6-8fc8-46c11fead4a7\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.445204 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-config-data\") pod \"69ef6013-a982-45c6-8fc8-46c11fead4a7\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.445242 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s5k9\" (UniqueName: \"kubernetes.io/projected/69ef6013-a982-45c6-8fc8-46c11fead4a7-kube-api-access-2s5k9\") pod \"69ef6013-a982-45c6-8fc8-46c11fead4a7\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.445447 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-db-sync-config-data\") pod \"69ef6013-a982-45c6-8fc8-46c11fead4a7\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " 
Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.445500 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69ef6013-a982-45c6-8fc8-46c11fead4a7-etc-machine-id\") pod \"69ef6013-a982-45c6-8fc8-46c11fead4a7\" (UID: \"69ef6013-a982-45c6-8fc8-46c11fead4a7\") " Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.445938 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69ef6013-a982-45c6-8fc8-46c11fead4a7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "69ef6013-a982-45c6-8fc8-46c11fead4a7" (UID: "69ef6013-a982-45c6-8fc8-46c11fead4a7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.458421 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "69ef6013-a982-45c6-8fc8-46c11fead4a7" (UID: "69ef6013-a982-45c6-8fc8-46c11fead4a7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.463211 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ef6013-a982-45c6-8fc8-46c11fead4a7-kube-api-access-2s5k9" (OuterVolumeSpecName: "kube-api-access-2s5k9") pod "69ef6013-a982-45c6-8fc8-46c11fead4a7" (UID: "69ef6013-a982-45c6-8fc8-46c11fead4a7"). InnerVolumeSpecName "kube-api-access-2s5k9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.475006 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b984779c5-sfr2l"] Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.475527 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-scripts" (OuterVolumeSpecName: "scripts") pod "69ef6013-a982-45c6-8fc8-46c11fead4a7" (UID: "69ef6013-a982-45c6-8fc8-46c11fead4a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.496204 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp"] Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.544484 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69ef6013-a982-45c6-8fc8-46c11fead4a7" (UID: "69ef6013-a982-45c6-8fc8-46c11fead4a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.550267 4867 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.551880 4867 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69ef6013-a982-45c6-8fc8-46c11fead4a7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.554304 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.555073 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.557496 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s5k9\" (UniqueName: \"kubernetes.io/projected/69ef6013-a982-45c6-8fc8-46c11fead4a7-kube-api-access-2s5k9\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.601788 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-config-data" (OuterVolumeSpecName: "config-data") pod "69ef6013-a982-45c6-8fc8-46c11fead4a7" (UID: "69ef6013-a982-45c6-8fc8-46c11fead4a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:25 crc kubenswrapper[4867]: I1006 13:22:25.659785 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69ef6013-a982-45c6-8fc8-46c11fead4a7-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.245406 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" event={"ID":"cdf0758d-d2e6-4660-8b3b-677c5febec8f","Type":"ContainerStarted","Data":"f88c8b9ecea899e15bd32c4cd4b692f5e2c49bb9e2bde03c95e2bf4b10d78cbe"} Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.258792 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fc4fffc87-p6rts" event={"ID":"f6057ffc-7d15-4097-b9d2-677fa9e69920","Type":"ContainerStarted","Data":"49f0850af72dd934d0a1a1224a8b8ef63563a4af78a22b1bd69498de41bf8be7"} Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.287660 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58bc94ddbb-kpk6w" event={"ID":"97cb892b-93d3-4c28-8c32-e2abaaa80dce","Type":"ContainerStarted","Data":"925011d1366643c86474669b03d4c9cd1af3c1c28b81be813f68d4b94ec7c743"} Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.287732 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58bc94ddbb-kpk6w" event={"ID":"97cb892b-93d3-4c28-8c32-e2abaaa80dce","Type":"ContainerStarted","Data":"d8758d3ca0058ce6cc97efb76387c8f23b72b5e8f190be9e2941e226b9989a2a"} Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.287743 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58bc94ddbb-kpk6w" event={"ID":"97cb892b-93d3-4c28-8c32-e2abaaa80dce","Type":"ContainerStarted","Data":"d47779adb6edadfcc3f1b8f986bee4acec3d885cb7d9d8b243a1d1a396f1aabe"} Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.288621 4867 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.288657 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.307966 4867 generic.go:334] "Generic (PLEG): container finished" podID="2c2555d5-800d-4da9-bfc6-2a2d010ddb84" containerID="b3a8fe8c3346e538c5f7e6adcda565bb774ab21fbeb61e1381e9b80f17ac1217" exitCode=0 Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.308062 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" event={"ID":"2c2555d5-800d-4da9-bfc6-2a2d010ddb84","Type":"ContainerDied","Data":"b3a8fe8c3346e538c5f7e6adcda565bb774ab21fbeb61e1381e9b80f17ac1217"} Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.308095 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" event={"ID":"2c2555d5-800d-4da9-bfc6-2a2d010ddb84","Type":"ContainerStarted","Data":"68f7c0965b422fcd8114a72e03d13139d37f606e67222aea73aa75d1b55c0d6c"} Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.335877 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-58bc94ddbb-kpk6w" podStartSLOduration=3.335854771 podStartE2EDuration="3.335854771s" podCreationTimestamp="2025-10-06 13:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:26.314418465 +0000 UTC m=+1125.772366609" watchObservedRunningTime="2025-10-06 13:22:26.335854771 +0000 UTC m=+1125.793802915" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.362788 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-m8h8z" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.375435 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m8h8z" event={"ID":"69ef6013-a982-45c6-8fc8-46c11fead4a7","Type":"ContainerDied","Data":"bc723ac44659405b7e03c95fb1ff5694a7efb5e8a52d3ea3db75d587f0edf58a"} Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.375497 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc723ac44659405b7e03c95fb1ff5694a7efb5e8a52d3ea3db75d587f0edf58a" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.638277 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 13:22:26 crc kubenswrapper[4867]: E1006 13:22:26.644491 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ef6013-a982-45c6-8fc8-46c11fead4a7" containerName="cinder-db-sync" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.644541 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ef6013-a982-45c6-8fc8-46c11fead4a7" containerName="cinder-db-sync" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.644796 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ef6013-a982-45c6-8fc8-46c11fead4a7" containerName="cinder-db-sync" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.645865 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.649968 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.650257 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.650396 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r4jxx" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.650494 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.666082 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.686222 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b984779c5-sfr2l"] Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.700949 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.701050 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-scripts\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.701092 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gc6z\" 
(UniqueName: \"kubernetes.io/projected/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-kube-api-access-4gc6z\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.701122 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.701149 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.701213 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.754141 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76747ff567-27q8x"] Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.757166 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.788654 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76747ff567-27q8x"] Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.802894 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-config\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.802943 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkxjn\" (UniqueName: \"kubernetes.io/projected/63a14623-32a3-4753-8626-4ffba880aced-kube-api-access-pkxjn\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.802985 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-scripts\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.803008 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gc6z\" (UniqueName: \"kubernetes.io/projected/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-kube-api-access-4gc6z\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.803029 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.803050 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.803069 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.803094 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-nb\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.803115 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-swift-storage-0\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.803155 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-sb\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.803174 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-svc\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.803221 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.806936 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.836602 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.837463 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.852454 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.855656 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-scripts\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.866597 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gc6z\" (UniqueName: \"kubernetes.io/projected/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-kube-api-access-4gc6z\") pod \"cinder-scheduler-0\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.905193 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-sb\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.905250 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-svc\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.905357 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-config\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.905385 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkxjn\" (UniqueName: \"kubernetes.io/projected/63a14623-32a3-4753-8626-4ffba880aced-kube-api-access-pkxjn\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.905465 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-nb\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.905486 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-swift-storage-0\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.906340 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-swift-storage-0\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.907305 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-sb\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.907877 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-svc\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.908447 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-config\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.909243 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-nb\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:26 crc kubenswrapper[4867]: I1006 13:22:26.948362 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkxjn\" (UniqueName: \"kubernetes.io/projected/63a14623-32a3-4753-8626-4ffba880aced-kube-api-access-pkxjn\") pod \"dnsmasq-dns-76747ff567-27q8x\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") " pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.038773 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.147238 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.178503 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.204890 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.205050 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.216006 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.345525 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1325d7b5-2352-42c2-bed0-9105dd98593d-logs\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.345688 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-scripts\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.345715 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1325d7b5-2352-42c2-bed0-9105dd98593d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc 
kubenswrapper[4867]: I1006 13:22:27.345745 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk49d\" (UniqueName: \"kubernetes.io/projected/1325d7b5-2352-42c2-bed0-9105dd98593d-kube-api-access-kk49d\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.345783 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.345857 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.345939 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data-custom\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.448258 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data-custom\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.448857 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1325d7b5-2352-42c2-bed0-9105dd98593d-logs\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.448931 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-scripts\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.448957 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1325d7b5-2352-42c2-bed0-9105dd98593d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.448975 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk49d\" (UniqueName: \"kubernetes.io/projected/1325d7b5-2352-42c2-bed0-9105dd98593d-kube-api-access-kk49d\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.449003 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.449039 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc 
kubenswrapper[4867]: I1006 13:22:27.451129 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1325d7b5-2352-42c2-bed0-9105dd98593d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.451634 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1325d7b5-2352-42c2-bed0-9105dd98593d-logs\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.461530 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data-custom\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.463299 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.463379 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-scripts\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.478254 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk49d\" (UniqueName: \"kubernetes.io/projected/1325d7b5-2352-42c2-bed0-9105dd98593d-kube-api-access-kk49d\") pod \"cinder-api-0\" (UID: 
\"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.480291 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " pod="openstack/cinder-api-0" Oct 06 13:22:27 crc kubenswrapper[4867]: I1006 13:22:27.617900 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 13:22:28 crc kubenswrapper[4867]: I1006 13:22:28.060844 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76747ff567-27q8x"] Oct 06 13:22:28 crc kubenswrapper[4867]: I1006 13:22:28.259017 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 13:22:28 crc kubenswrapper[4867]: I1006 13:22:28.483564 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76747ff567-27q8x" event={"ID":"63a14623-32a3-4753-8626-4ffba880aced","Type":"ContainerStarted","Data":"8e9970428f72926ced79545fe2d852d7074790cafe643bab7d940652abfcf1ac"} Oct 06 13:22:28 crc kubenswrapper[4867]: I1006 13:22:28.491992 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" event={"ID":"2c2555d5-800d-4da9-bfc6-2a2d010ddb84","Type":"ContainerStarted","Data":"84358d77e92648d974573063ab69d462baf1d76454996b77d75ca1d9d7542be9"} Oct 06 13:22:28 crc kubenswrapper[4867]: I1006 13:22:28.492241 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" podUID="2c2555d5-800d-4da9-bfc6-2a2d010ddb84" containerName="dnsmasq-dns" containerID="cri-o://84358d77e92648d974573063ab69d462baf1d76454996b77d75ca1d9d7542be9" gracePeriod=10 Oct 06 13:22:28 crc kubenswrapper[4867]: I1006 13:22:28.492704 4867 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:28 crc kubenswrapper[4867]: I1006 13:22:28.510351 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 13:22:28 crc kubenswrapper[4867]: I1006 13:22:28.525606 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" podStartSLOduration=5.525585834 podStartE2EDuration="5.525585834s" podCreationTimestamp="2025-10-06 13:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:28.517486472 +0000 UTC m=+1127.975434636" watchObservedRunningTime="2025-10-06 13:22:28.525585834 +0000 UTC m=+1127.983533978" Oct 06 13:22:28 crc kubenswrapper[4867]: I1006 13:22:28.569698 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 13:22:28 crc kubenswrapper[4867]: I1006 13:22:28.571986 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5f7bcb84f4-pcrvc" Oct 06 13:22:28 crc kubenswrapper[4867]: I1006 13:22:28.974247 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:29 crc kubenswrapper[4867]: I1006 13:22:29.011073 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:29 crc kubenswrapper[4867]: I1006 13:22:29.516279 4867 generic.go:334] "Generic (PLEG): container finished" podID="c54babca-19bd-4a0b-a320-359b744ed066" containerID="2db97eecc610c3e34e2ba47ca51b7c8081ac513991aa1abd2d67d0e2841f3efe" exitCode=1 Oct 06 13:22:29 crc kubenswrapper[4867]: I1006 13:22:29.516381 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"c54babca-19bd-4a0b-a320-359b744ed066","Type":"ContainerDied","Data":"2db97eecc610c3e34e2ba47ca51b7c8081ac513991aa1abd2d67d0e2841f3efe"} Oct 06 13:22:29 crc kubenswrapper[4867]: I1006 13:22:29.516422 4867 scope.go:117] "RemoveContainer" containerID="0f802edb63fdf8dca1884c34456bdfaae9343ba7d28f568c77ce301a24ce535d" Oct 06 13:22:29 crc kubenswrapper[4867]: I1006 13:22:29.517079 4867 scope.go:117] "RemoveContainer" containerID="2db97eecc610c3e34e2ba47ca51b7c8081ac513991aa1abd2d67d0e2841f3efe" Oct 06 13:22:29 crc kubenswrapper[4867]: E1006 13:22:29.517384 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c54babca-19bd-4a0b-a320-359b744ed066)\"" pod="openstack/watcher-decision-engine-0" podUID="c54babca-19bd-4a0b-a320-359b744ed066" Oct 06 13:22:29 crc kubenswrapper[4867]: I1006 13:22:29.531667 4867 generic.go:334] "Generic (PLEG): container finished" podID="2c2555d5-800d-4da9-bfc6-2a2d010ddb84" containerID="84358d77e92648d974573063ab69d462baf1d76454996b77d75ca1d9d7542be9" exitCode=0 Oct 06 13:22:29 crc kubenswrapper[4867]: I1006 13:22:29.531736 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" event={"ID":"2c2555d5-800d-4da9-bfc6-2a2d010ddb84","Type":"ContainerDied","Data":"84358d77e92648d974573063ab69d462baf1d76454996b77d75ca1d9d7542be9"} Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.569704 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.605708 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f","Type":"ContainerStarted","Data":"63ef29b625b547d9d2e0b113b98572bf47901a6b63e948b8e28239f8cd8db653"} Oct 06 13:22:30 crc 
kubenswrapper[4867]: I1006 13:22:30.628927 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1325d7b5-2352-42c2-bed0-9105dd98593d","Type":"ContainerStarted","Data":"2f7b75e7a9c5ef44d7d42e91b0ae0a42e854f808cc3b09e2fc24d75ce63ea1c5"} Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.705533 4867 scope.go:117] "RemoveContainer" containerID="2db97eecc610c3e34e2ba47ca51b7c8081ac513991aa1abd2d67d0e2841f3efe" Oct 06 13:22:30 crc kubenswrapper[4867]: E1006 13:22:30.706077 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c54babca-19bd-4a0b-a320-359b744ed066)\"" pod="openstack/watcher-decision-engine-0" podUID="c54babca-19bd-4a0b-a320-359b744ed066" Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.718455 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76747ff567-27q8x" event={"ID":"63a14623-32a3-4753-8626-4ffba880aced","Type":"ContainerStarted","Data":"8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c"} Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.768298 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.868639 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-sb\") pod \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.869134 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvctx\" (UniqueName: \"kubernetes.io/projected/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-kube-api-access-dvctx\") pod \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.869169 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-svc\") pod \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.869227 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-config\") pod \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.904232 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-kube-api-access-dvctx" (OuterVolumeSpecName: "kube-api-access-dvctx") pod "2c2555d5-800d-4da9-bfc6-2a2d010ddb84" (UID: "2c2555d5-800d-4da9-bfc6-2a2d010ddb84"). InnerVolumeSpecName "kube-api-access-dvctx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.975916 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-nb\") pod \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.976845 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-swift-storage-0\") pod \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\" (UID: \"2c2555d5-800d-4da9-bfc6-2a2d010ddb84\") " Oct 06 13:22:30 crc kubenswrapper[4867]: I1006 13:22:30.977417 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvctx\" (UniqueName: \"kubernetes.io/projected/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-kube-api-access-dvctx\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.392188 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="288a6591-36fc-453e-b41f-c0bed1da11b6" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.177:9322/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.494337 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-config" (OuterVolumeSpecName: "config") pod "2c2555d5-800d-4da9-bfc6-2a2d010ddb84" (UID: "2c2555d5-800d-4da9-bfc6-2a2d010ddb84"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.508345 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c2555d5-800d-4da9-bfc6-2a2d010ddb84" (UID: "2c2555d5-800d-4da9-bfc6-2a2d010ddb84"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.546978 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c2555d5-800d-4da9-bfc6-2a2d010ddb84" (UID: "2c2555d5-800d-4da9-bfc6-2a2d010ddb84"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.558786 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c2555d5-800d-4da9-bfc6-2a2d010ddb84" (UID: "2c2555d5-800d-4da9-bfc6-2a2d010ddb84"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.560057 4867 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod60344bb3-1d96-49fc-bf7b-2fe9452160d3"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod60344bb3-1d96-49fc-bf7b-2fe9452160d3] : Timed out while waiting for systemd to remove kubepods-besteffort-pod60344bb3_1d96_49fc_bf7b_2fe9452160d3.slice" Oct 06 13:22:31 crc kubenswrapper[4867]: E1006 13:22:31.560147 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod60344bb3-1d96-49fc-bf7b-2fe9452160d3] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod60344bb3-1d96-49fc-bf7b-2fe9452160d3] : Timed out while waiting for systemd to remove kubepods-besteffort-pod60344bb3_1d96_49fc_bf7b_2fe9452160d3.slice" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" podUID="60344bb3-1d96-49fc-bf7b-2fe9452160d3" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.567877 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-86d4db6f74-khhjk"] Oct 06 13:22:31 crc kubenswrapper[4867]: E1006 13:22:31.569015 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2555d5-800d-4da9-bfc6-2a2d010ddb84" containerName="init" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.569059 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2555d5-800d-4da9-bfc6-2a2d010ddb84" containerName="init" Oct 06 13:22:31 crc kubenswrapper[4867]: E1006 13:22:31.569083 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2555d5-800d-4da9-bfc6-2a2d010ddb84" containerName="dnsmasq-dns" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.569091 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2555d5-800d-4da9-bfc6-2a2d010ddb84" containerName="dnsmasq-dns" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 
13:22:31.569725 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2555d5-800d-4da9-bfc6-2a2d010ddb84" containerName="dnsmasq-dns" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.571435 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.571462 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86d4db6f74-khhjk"] Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.571467 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.571625 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.572492 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.572040 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.579879 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.580068 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.618218 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2c2555d5-800d-4da9-bfc6-2a2d010ddb84" (UID: "2c2555d5-800d-4da9-bfc6-2a2d010ddb84"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.675246 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-config-data-custom\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.675322 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-config-data\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.675393 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72dc3e7-d107-4153-9ce3-b092369b5d66-logs\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: 
\"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.675420 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrrtx\" (UniqueName: \"kubernetes.io/projected/a72dc3e7-d107-4153-9ce3-b092369b5d66-kube-api-access-mrrtx\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.675440 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-combined-ca-bundle\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.675477 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-public-tls-certs\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.675550 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-internal-tls-certs\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.675621 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c2555d5-800d-4da9-bfc6-2a2d010ddb84-dns-swift-storage-0\") 
on node \"crc\" DevicePath \"\"" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.706959 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-c47455745-hd5zg" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.784128 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-config-data-custom\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.784199 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-config-data\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.789877 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72dc3e7-d107-4153-9ce3-b092369b5d66-logs\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.789974 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrrtx\" (UniqueName: \"kubernetes.io/projected/a72dc3e7-d107-4153-9ce3-b092369b5d66-kube-api-access-mrrtx\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.790054 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-combined-ca-bundle\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.790113 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-public-tls-certs\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.792887 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-internal-tls-certs\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.796660 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a72dc3e7-d107-4153-9ce3-b092369b5d66-logs\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.798050 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-config-data\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.799180 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-internal-tls-certs\") pod 
\"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.822999 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.823043 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" event={"ID":"2c2555d5-800d-4da9-bfc6-2a2d010ddb84","Type":"ContainerDied","Data":"68f7c0965b422fcd8114a72e03d13139d37f606e67222aea73aa75d1b55c0d6c"} Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.823087 4867 scope.go:117] "RemoveContainer" containerID="84358d77e92648d974573063ab69d462baf1d76454996b77d75ca1d9d7542be9" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.823982 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b984779c5-sfr2l" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.826388 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-combined-ca-bundle\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.827659 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-public-tls-certs\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.828223 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f7bcb84f4-pcrvc"] Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.828526 4867 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f7bcb84f4-pcrvc" podUID="f6218a59-1db5-4438-9fda-7781c1d4978b" containerName="neutron-api" containerID="cri-o://2907d6d90ebdd474879af9266eaa64ceae93e8d4b8d59a82e2a647b608165118" gracePeriod=30 Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.828671 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f7bcb84f4-pcrvc" podUID="f6218a59-1db5-4438-9fda-7781c1d4978b" containerName="neutron-httpd" containerID="cri-o://6ebdeedb331438c0c01a083fb0c28d23295c8aa33cc117bd9ce5f3ef209832d0" gracePeriod=30 Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.829379 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrrtx\" (UniqueName: \"kubernetes.io/projected/a72dc3e7-d107-4153-9ce3-b092369b5d66-kube-api-access-mrrtx\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.840365 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a72dc3e7-d107-4153-9ce3-b092369b5d66-config-data-custom\") pod \"barbican-api-86d4db6f74-khhjk\" (UID: \"a72dc3e7-d107-4153-9ce3-b092369b5d66\") " pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.847483 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" event={"ID":"cdf0758d-d2e6-4660-8b3b-677c5febec8f","Type":"ContainerStarted","Data":"3fc71d961f90bba9f95d98033c13c8bfc0b04c1a16bd029d769d34599ad0093b"} Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.885124 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fc4fffc87-p6rts" 
event={"ID":"f6057ffc-7d15-4097-b9d2-677fa9e69920","Type":"ContainerStarted","Data":"d809fda3cba6ee71a09c422a6aa0df2593991492b3e63dcc8527bd928e86c9e0"} Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.911591 4867 generic.go:334] "Generic (PLEG): container finished" podID="ef1f9710-07da-4226-befb-73474a496cae" containerID="071906446a1f7b1b30a9be47203da070f524eb694ef9e09aa6dfdc6449357c85" exitCode=137 Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.911629 4867 generic.go:334] "Generic (PLEG): container finished" podID="ef1f9710-07da-4226-befb-73474a496cae" containerID="55fe2cb95aabb0ee7634f115dd8c77c66b892c09e51aab6c222f89e9ef0a91db" exitCode=137 Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.911713 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d55bc49dc-hzd62" event={"ID":"ef1f9710-07da-4226-befb-73474a496cae","Type":"ContainerDied","Data":"071906446a1f7b1b30a9be47203da070f524eb694ef9e09aa6dfdc6449357c85"} Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.911746 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d55bc49dc-hzd62" event={"ID":"ef1f9710-07da-4226-befb-73474a496cae","Type":"ContainerDied","Data":"55fe2cb95aabb0ee7634f115dd8c77c66b892c09e51aab6c222f89e9ef0a91db"} Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.917537 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5fc4fffc87-p6rts" podStartSLOduration=4.018625569 podStartE2EDuration="8.917518089s" podCreationTimestamp="2025-10-06 13:22:23 +0000 UTC" firstStartedPulling="2025-10-06 13:22:25.502787302 +0000 UTC m=+1124.960735446" lastFinishedPulling="2025-10-06 13:22:30.401679822 +0000 UTC m=+1129.859627966" observedRunningTime="2025-10-06 13:22:31.91206653 +0000 UTC m=+1131.370014674" watchObservedRunningTime="2025-10-06 13:22:31.917518089 +0000 UTC m=+1131.375466233" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.919971 4867 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.974353 4867 generic.go:334] "Generic (PLEG): container finished" podID="63a14623-32a3-4753-8626-4ffba880aced" containerID="8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c" exitCode=0 Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.974827 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5df64d6755-5gzc7" Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.975156 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76747ff567-27q8x" event={"ID":"63a14623-32a3-4753-8626-4ffba880aced","Type":"ContainerDied","Data":"8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c"} Oct 06 13:22:31 crc kubenswrapper[4867]: I1006 13:22:31.994086 4867 scope.go:117] "RemoveContainer" containerID="b3a8fe8c3346e538c5f7e6adcda565bb774ab21fbeb61e1381e9b80f17ac1217" Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.006163 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b984779c5-sfr2l"] Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.017691 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b984779c5-sfr2l"] Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.277035 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.282651 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5df64d6755-5gzc7"] Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.301529 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5df64d6755-5gzc7"] Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.436165 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7hgh\" (UniqueName: \"kubernetes.io/projected/ef1f9710-07da-4226-befb-73474a496cae-kube-api-access-w7hgh\") pod \"ef1f9710-07da-4226-befb-73474a496cae\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.437056 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-config-data\") pod \"ef1f9710-07da-4226-befb-73474a496cae\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.437992 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef1f9710-07da-4226-befb-73474a496cae-horizon-secret-key\") pod \"ef1f9710-07da-4226-befb-73474a496cae\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.438172 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1f9710-07da-4226-befb-73474a496cae-logs\") pod \"ef1f9710-07da-4226-befb-73474a496cae\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.440037 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ef1f9710-07da-4226-befb-73474a496cae-logs" (OuterVolumeSpecName: "logs") pod "ef1f9710-07da-4226-befb-73474a496cae" (UID: "ef1f9710-07da-4226-befb-73474a496cae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.440209 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-scripts\") pod \"ef1f9710-07da-4226-befb-73474a496cae\" (UID: \"ef1f9710-07da-4226-befb-73474a496cae\") " Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.442491 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1f9710-07da-4226-befb-73474a496cae-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.447954 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1f9710-07da-4226-befb-73474a496cae-kube-api-access-w7hgh" (OuterVolumeSpecName: "kube-api-access-w7hgh") pod "ef1f9710-07da-4226-befb-73474a496cae" (UID: "ef1f9710-07da-4226-befb-73474a496cae"). InnerVolumeSpecName "kube-api-access-w7hgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.456577 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1f9710-07da-4226-befb-73474a496cae-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ef1f9710-07da-4226-befb-73474a496cae" (UID: "ef1f9710-07da-4226-befb-73474a496cae"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.485897 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-scripts" (OuterVolumeSpecName: "scripts") pod "ef1f9710-07da-4226-befb-73474a496cae" (UID: "ef1f9710-07da-4226-befb-73474a496cae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.500075 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-config-data" (OuterVolumeSpecName: "config-data") pod "ef1f9710-07da-4226-befb-73474a496cae" (UID: "ef1f9710-07da-4226-befb-73474a496cae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.544837 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7hgh\" (UniqueName: \"kubernetes.io/projected/ef1f9710-07da-4226-befb-73474a496cae-kube-api-access-w7hgh\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.544896 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.544910 4867 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef1f9710-07da-4226-befb-73474a496cae-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:32 crc kubenswrapper[4867]: I1006 13:22:32.544922 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef1f9710-07da-4226-befb-73474a496cae-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:32 crc 
kubenswrapper[4867]: I1006 13:22:32.709988 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86d4db6f74-khhjk"] Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.007569 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" event={"ID":"cdf0758d-d2e6-4660-8b3b-677c5febec8f","Type":"ContainerStarted","Data":"efd4f00337943ea29882ec350519e3e1f7c0dff5a67b77508cd4d1667a198808"} Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.021073 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86d4db6f74-khhjk" event={"ID":"a72dc3e7-d107-4153-9ce3-b092369b5d66","Type":"ContainerStarted","Data":"f503c3205f0a65bfc0fa3019d656126ce77793f1d5f28fbcd5a2e34e55d094f6"} Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.053417 4867 generic.go:334] "Generic (PLEG): container finished" podID="a8819e30-41b5-4fcd-8158-b6b5c178aea9" containerID="c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d" exitCode=137 Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.053895 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7544c988fc-272d4" event={"ID":"a8819e30-41b5-4fcd-8158-b6b5c178aea9","Type":"ContainerDied","Data":"c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d"} Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.057797 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5f64dc9ddb-5ltwp" podStartSLOduration=5.160459461 podStartE2EDuration="10.057777677s" podCreationTimestamp="2025-10-06 13:22:23 +0000 UTC" firstStartedPulling="2025-10-06 13:22:25.504856289 +0000 UTC m=+1124.962804433" lastFinishedPulling="2025-10-06 13:22:30.402174505 +0000 UTC m=+1129.860122649" observedRunningTime="2025-10-06 13:22:33.049387037 +0000 UTC m=+1132.507335181" watchObservedRunningTime="2025-10-06 13:22:33.057777677 +0000 UTC m=+1132.515725821" 
Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.135410 4867 generic.go:334] "Generic (PLEG): container finished" podID="f6218a59-1db5-4438-9fda-7781c1d4978b" containerID="6ebdeedb331438c0c01a083fb0c28d23295c8aa33cc117bd9ce5f3ef209832d0" exitCode=0 Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.135616 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f7bcb84f4-pcrvc" event={"ID":"f6218a59-1db5-4438-9fda-7781c1d4978b","Type":"ContainerDied","Data":"6ebdeedb331438c0c01a083fb0c28d23295c8aa33cc117bd9ce5f3ef209832d0"} Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.154464 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1325d7b5-2352-42c2-bed0-9105dd98593d","Type":"ContainerStarted","Data":"fe6a1439ea2216b133cc72e765f250d593e851fe0cbe7ac615617303978b8d47"} Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.191624 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76747ff567-27q8x" event={"ID":"63a14623-32a3-4753-8626-4ffba880aced","Type":"ContainerStarted","Data":"9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f"} Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.192682 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.255059 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76747ff567-27q8x" podStartSLOduration=7.25502302 podStartE2EDuration="7.25502302s" podCreationTimestamp="2025-10-06 13:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:33.230308964 +0000 UTC m=+1132.688257108" watchObservedRunningTime="2025-10-06 13:22:33.25502302 +0000 UTC m=+1132.712971164" Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.275928 4867 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2555d5-800d-4da9-bfc6-2a2d010ddb84" path="/var/lib/kubelet/pods/2c2555d5-800d-4da9-bfc6-2a2d010ddb84/volumes" Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.276683 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60344bb3-1d96-49fc-bf7b-2fe9452160d3" path="/var/lib/kubelet/pods/60344bb3-1d96-49fc-bf7b-2fe9452160d3/volumes" Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.277669 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fc4fffc87-p6rts" event={"ID":"f6057ffc-7d15-4097-b9d2-677fa9e69920","Type":"ContainerStarted","Data":"892dc1571d3c33c577d6fe1e69e0c44a42fd63a566b9d5e01e7999abb15d55bf"} Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.306102 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d55bc49dc-hzd62" event={"ID":"ef1f9710-07da-4226-befb-73474a496cae","Type":"ContainerDied","Data":"a3dff45e328e489cd5866aa74480bf46905b0c7ee79f693480e6ca409986c667"} Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.306182 4867 scope.go:117] "RemoveContainer" containerID="071906446a1f7b1b30a9be47203da070f524eb694ef9e09aa6dfdc6449357c85" Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.306376 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d55bc49dc-hzd62" Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.390339 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d55bc49dc-hzd62"] Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.405708 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7d55bc49dc-hzd62"] Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.569731 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.637846 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.760100 4867 scope.go:117] "RemoveContainer" containerID="55fe2cb95aabb0ee7634f115dd8c77c66b892c09e51aab6c222f89e9ef0a91db" Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.844091 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.993121 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-scripts\") pod \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.993706 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8819e30-41b5-4fcd-8158-b6b5c178aea9-logs\") pod \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.993735 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a8819e30-41b5-4fcd-8158-b6b5c178aea9-horizon-secret-key\") pod \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.993808 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkxpq\" (UniqueName: \"kubernetes.io/projected/a8819e30-41b5-4fcd-8158-b6b5c178aea9-kube-api-access-rkxpq\") pod \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.993889 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-config-data\") pod \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\" (UID: \"a8819e30-41b5-4fcd-8158-b6b5c178aea9\") " Oct 06 13:22:33 crc kubenswrapper[4867]: I1006 13:22:33.994945 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a8819e30-41b5-4fcd-8158-b6b5c178aea9-logs" (OuterVolumeSpecName: "logs") pod "a8819e30-41b5-4fcd-8158-b6b5c178aea9" (UID: "a8819e30-41b5-4fcd-8158-b6b5c178aea9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.007736 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8819e30-41b5-4fcd-8158-b6b5c178aea9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a8819e30-41b5-4fcd-8158-b6b5c178aea9" (UID: "a8819e30-41b5-4fcd-8158-b6b5c178aea9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.010464 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8819e30-41b5-4fcd-8158-b6b5c178aea9-kube-api-access-rkxpq" (OuterVolumeSpecName: "kube-api-access-rkxpq") pod "a8819e30-41b5-4fcd-8158-b6b5c178aea9" (UID: "a8819e30-41b5-4fcd-8158-b6b5c178aea9"). InnerVolumeSpecName "kube-api-access-rkxpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.072781 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-scripts" (OuterVolumeSpecName: "scripts") pod "a8819e30-41b5-4fcd-8158-b6b5c178aea9" (UID: "a8819e30-41b5-4fcd-8158-b6b5c178aea9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.098820 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.099089 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8819e30-41b5-4fcd-8158-b6b5c178aea9-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.099194 4867 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a8819e30-41b5-4fcd-8158-b6b5c178aea9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.099299 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkxpq\" (UniqueName: \"kubernetes.io/projected/a8819e30-41b5-4fcd-8158-b6b5c178aea9-kube-api-access-rkxpq\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.178976 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-config-data" (OuterVolumeSpecName: "config-data") pod "a8819e30-41b5-4fcd-8158-b6b5c178aea9" (UID: "a8819e30-41b5-4fcd-8158-b6b5c178aea9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.202706 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a8819e30-41b5-4fcd-8158-b6b5c178aea9-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.431534 4867 generic.go:334] "Generic (PLEG): container finished" podID="a8819e30-41b5-4fcd-8158-b6b5c178aea9" containerID="587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5" exitCode=137 Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.432042 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7544c988fc-272d4" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.433508 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7544c988fc-272d4" event={"ID":"a8819e30-41b5-4fcd-8158-b6b5c178aea9","Type":"ContainerDied","Data":"587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5"} Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.433560 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7544c988fc-272d4" event={"ID":"a8819e30-41b5-4fcd-8158-b6b5c178aea9","Type":"ContainerDied","Data":"002c355b508e1ddbe3658075fd90c0d786e7368db465321304d15db9b491dc80"} Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.433580 4867 scope.go:117] "RemoveContainer" containerID="587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.494669 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f","Type":"ContainerStarted","Data":"89a6aa6de775e66b0aa0b2f71bfec0c4ccd596c4612ed6ef676a3a5988aeeeb2"} Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.528950 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"1325d7b5-2352-42c2-bed0-9105dd98593d","Type":"ContainerStarted","Data":"8371d8ca6dfb70ddcb4407bd0fb5bc5ccec77f0b3d9d77dc53c879193088c37b"} Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.529060 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1325d7b5-2352-42c2-bed0-9105dd98593d" containerName="cinder-api-log" containerID="cri-o://fe6a1439ea2216b133cc72e765f250d593e851fe0cbe7ac615617303978b8d47" gracePeriod=30 Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.529197 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1325d7b5-2352-42c2-bed0-9105dd98593d" containerName="cinder-api" containerID="cri-o://8371d8ca6dfb70ddcb4407bd0fb5bc5ccec77f0b3d9d77dc53c879193088c37b" gracePeriod=30 Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.529390 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.559993 4867 generic.go:334] "Generic (PLEG): container finished" podID="2a642d87-db23-4d12-90ac-ebdfbfe00996" containerID="60499b97d5c35d9b94dd6e0ffb01e9156330c4d6a9e670fc2b52d63436d243f7" exitCode=137 Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.560026 4867 generic.go:334] "Generic (PLEG): container finished" podID="2a642d87-db23-4d12-90ac-ebdfbfe00996" containerID="637ed67407d0d2cc300c14f7387d0fd2855bd868a43ad5c7bad18ba2e888ab7f" exitCode=137 Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.560078 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58966f7699-hjfh4" event={"ID":"2a642d87-db23-4d12-90ac-ebdfbfe00996","Type":"ContainerDied","Data":"60499b97d5c35d9b94dd6e0ffb01e9156330c4d6a9e670fc2b52d63436d243f7"} Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.560107 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-58966f7699-hjfh4" event={"ID":"2a642d87-db23-4d12-90ac-ebdfbfe00996","Type":"ContainerDied","Data":"637ed67407d0d2cc300c14f7387d0fd2855bd868a43ad5c7bad18ba2e888ab7f"} Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.574735 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86d4db6f74-khhjk" event={"ID":"a72dc3e7-d107-4153-9ce3-b092369b5d66","Type":"ContainerStarted","Data":"8415e8500ae57d00cdd25fb185d8e54ace8cd0e16283be57c91486b6942f71a8"} Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.574820 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86d4db6f74-khhjk" event={"ID":"a72dc3e7-d107-4153-9ce3-b092369b5d66","Type":"ContainerStarted","Data":"0743c5695eeded97fbb1b159daa7891dba3b0b5ee3fa7adbad5af66bdd0a2155"} Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.575169 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.575267 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86d4db6f74-khhjk" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.599378 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=8.599358298 podStartE2EDuration="8.599358298s" podCreationTimestamp="2025-10-06 13:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:34.559736265 +0000 UTC m=+1134.017684409" watchObservedRunningTime="2025-10-06 13:22:34.599358298 +0000 UTC m=+1134.057306442" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.603532 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.604337 4867 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.612543 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-86d4db6f74-khhjk" podStartSLOduration=3.612505368 podStartE2EDuration="3.612505368s" podCreationTimestamp="2025-10-06 13:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:34.594515416 +0000 UTC m=+1134.052463560" watchObservedRunningTime="2025-10-06 13:22:34.612505368 +0000 UTC m=+1134.070453512" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.723504 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-scripts\") pod \"2a642d87-db23-4d12-90ac-ebdfbfe00996\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.723652 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a642d87-db23-4d12-90ac-ebdfbfe00996-logs\") pod \"2a642d87-db23-4d12-90ac-ebdfbfe00996\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.723678 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th9gw\" (UniqueName: \"kubernetes.io/projected/2a642d87-db23-4d12-90ac-ebdfbfe00996-kube-api-access-th9gw\") pod \"2a642d87-db23-4d12-90ac-ebdfbfe00996\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.723725 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-config-data\") pod \"2a642d87-db23-4d12-90ac-ebdfbfe00996\" 
(UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.723809 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a642d87-db23-4d12-90ac-ebdfbfe00996-horizon-secret-key\") pod \"2a642d87-db23-4d12-90ac-ebdfbfe00996\" (UID: \"2a642d87-db23-4d12-90ac-ebdfbfe00996\") " Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.726187 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a642d87-db23-4d12-90ac-ebdfbfe00996-logs" (OuterVolumeSpecName: "logs") pod "2a642d87-db23-4d12-90ac-ebdfbfe00996" (UID: "2a642d87-db23-4d12-90ac-ebdfbfe00996"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.738080 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a642d87-db23-4d12-90ac-ebdfbfe00996-kube-api-access-th9gw" (OuterVolumeSpecName: "kube-api-access-th9gw") pod "2a642d87-db23-4d12-90ac-ebdfbfe00996" (UID: "2a642d87-db23-4d12-90ac-ebdfbfe00996"). InnerVolumeSpecName "kube-api-access-th9gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.746536 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a642d87-db23-4d12-90ac-ebdfbfe00996-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2a642d87-db23-4d12-90ac-ebdfbfe00996" (UID: "2a642d87-db23-4d12-90ac-ebdfbfe00996"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.795077 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-config-data" (OuterVolumeSpecName: "config-data") pod "2a642d87-db23-4d12-90ac-ebdfbfe00996" (UID: "2a642d87-db23-4d12-90ac-ebdfbfe00996"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.815205 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7544c988fc-272d4"] Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.825920 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a642d87-db23-4d12-90ac-ebdfbfe00996-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.825957 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th9gw\" (UniqueName: \"kubernetes.io/projected/2a642d87-db23-4d12-90ac-ebdfbfe00996-kube-api-access-th9gw\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.825968 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.825979 4867 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2a642d87-db23-4d12-90ac-ebdfbfe00996-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.831562 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-scripts" (OuterVolumeSpecName: "scripts") pod "2a642d87-db23-4d12-90ac-ebdfbfe00996" 
(UID: "2a642d87-db23-4d12-90ac-ebdfbfe00996"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.837899 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7544c988fc-272d4"] Oct 06 13:22:34 crc kubenswrapper[4867]: I1006 13:22:34.937679 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a642d87-db23-4d12-90ac-ebdfbfe00996-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.130330 4867 scope.go:117] "RemoveContainer" containerID="c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.256070 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8819e30-41b5-4fcd-8158-b6b5c178aea9" path="/var/lib/kubelet/pods/a8819e30-41b5-4fcd-8158-b6b5c178aea9/volumes" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.256798 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1f9710-07da-4226-befb-73474a496cae" path="/var/lib/kubelet/pods/ef1f9710-07da-4226-befb-73474a496cae/volumes" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.580729 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.604353 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f","Type":"ContainerStarted","Data":"cb27dfcc69fb5281dca40d4de5e84a807e649445d57d8ecdffb071e7a42e89a4"} Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.610092 4867 generic.go:334] "Generic (PLEG): container finished" podID="1325d7b5-2352-42c2-bed0-9105dd98593d" containerID="8371d8ca6dfb70ddcb4407bd0fb5bc5ccec77f0b3d9d77dc53c879193088c37b" exitCode=0 Oct 06 13:22:35 crc 
kubenswrapper[4867]: I1006 13:22:35.610130 4867 generic.go:334] "Generic (PLEG): container finished" podID="1325d7b5-2352-42c2-bed0-9105dd98593d" containerID="fe6a1439ea2216b133cc72e765f250d593e851fe0cbe7ac615617303978b8d47" exitCode=143 Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.610223 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1325d7b5-2352-42c2-bed0-9105dd98593d","Type":"ContainerDied","Data":"8371d8ca6dfb70ddcb4407bd0fb5bc5ccec77f0b3d9d77dc53c879193088c37b"} Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.610271 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1325d7b5-2352-42c2-bed0-9105dd98593d","Type":"ContainerDied","Data":"fe6a1439ea2216b133cc72e765f250d593e851fe0cbe7ac615617303978b8d47"} Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.622290 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58966f7699-hjfh4" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.622800 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58966f7699-hjfh4" event={"ID":"2a642d87-db23-4d12-90ac-ebdfbfe00996","Type":"ContainerDied","Data":"6aa36ef7116bc1e26c7658ff412270e82977faa327dea3d4d5fa3d36455bb37a"} Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.631440 4867 scope.go:117] "RemoveContainer" containerID="587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.634940 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=9.165304442 podStartE2EDuration="9.634920003s" podCreationTimestamp="2025-10-06 13:22:26 +0000 UTC" firstStartedPulling="2025-10-06 13:22:30.029329361 +0000 UTC m=+1129.487277505" lastFinishedPulling="2025-10-06 13:22:30.498944922 +0000 UTC m=+1129.956893066" observedRunningTime="2025-10-06 
13:22:35.633665898 +0000 UTC m=+1135.091614042" watchObservedRunningTime="2025-10-06 13:22:35.634920003 +0000 UTC m=+1135.092868147" Oct 06 13:22:35 crc kubenswrapper[4867]: E1006 13:22:35.646114 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5\": container with ID starting with 587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5 not found: ID does not exist" containerID="587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.646181 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5"} err="failed to get container status \"587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5\": rpc error: code = NotFound desc = could not find container \"587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5\": container with ID starting with 587f0c4f50a3a37d391a63eb68d9c5222ef98e7a6671e9717157a45d35f1b8a5 not found: ID does not exist" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.646213 4867 scope.go:117] "RemoveContainer" containerID="c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d" Oct 06 13:22:35 crc kubenswrapper[4867]: E1006 13:22:35.648434 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d\": container with ID starting with c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d not found: ID does not exist" containerID="c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.648462 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d"} err="failed to get container status \"c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d\": rpc error: code = NotFound desc = could not find container \"c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d\": container with ID starting with c3602b6f7c4d145e16b5b527d41d0029e81afe01378928c968b1afc7e5231e8d not found: ID does not exist" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.648477 4867 scope.go:117] "RemoveContainer" containerID="60499b97d5c35d9b94dd6e0ffb01e9156330c4d6a9e670fc2b52d63436d243f7" Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.674502 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58966f7699-hjfh4"] Oct 06 13:22:35 crc kubenswrapper[4867]: I1006 13:22:35.690783 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58966f7699-hjfh4"] Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.183555 4867 scope.go:117] "RemoveContainer" containerID="637ed67407d0d2cc300c14f7387d0fd2855bd868a43ad5c7bad18ba2e888ab7f" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.522715 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.600781 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk49d\" (UniqueName: \"kubernetes.io/projected/1325d7b5-2352-42c2-bed0-9105dd98593d-kube-api-access-kk49d\") pod \"1325d7b5-2352-42c2-bed0-9105dd98593d\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.600886 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data-custom\") pod \"1325d7b5-2352-42c2-bed0-9105dd98593d\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.600959 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data\") pod \"1325d7b5-2352-42c2-bed0-9105dd98593d\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.601055 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-scripts\") pod \"1325d7b5-2352-42c2-bed0-9105dd98593d\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.601218 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1325d7b5-2352-42c2-bed0-9105dd98593d-etc-machine-id\") pod \"1325d7b5-2352-42c2-bed0-9105dd98593d\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.601297 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-combined-ca-bundle\") pod \"1325d7b5-2352-42c2-bed0-9105dd98593d\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.601343 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1325d7b5-2352-42c2-bed0-9105dd98593d-logs\") pod \"1325d7b5-2352-42c2-bed0-9105dd98593d\" (UID: \"1325d7b5-2352-42c2-bed0-9105dd98593d\") " Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.603602 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1325d7b5-2352-42c2-bed0-9105dd98593d-logs" (OuterVolumeSpecName: "logs") pod "1325d7b5-2352-42c2-bed0-9105dd98593d" (UID: "1325d7b5-2352-42c2-bed0-9105dd98593d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.604349 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1325d7b5-2352-42c2-bed0-9105dd98593d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1325d7b5-2352-42c2-bed0-9105dd98593d" (UID: "1325d7b5-2352-42c2-bed0-9105dd98593d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.624473 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-scripts" (OuterVolumeSpecName: "scripts") pod "1325d7b5-2352-42c2-bed0-9105dd98593d" (UID: "1325d7b5-2352-42c2-bed0-9105dd98593d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.625454 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1325d7b5-2352-42c2-bed0-9105dd98593d-kube-api-access-kk49d" (OuterVolumeSpecName: "kube-api-access-kk49d") pod "1325d7b5-2352-42c2-bed0-9105dd98593d" (UID: "1325d7b5-2352-42c2-bed0-9105dd98593d"). InnerVolumeSpecName "kube-api-access-kk49d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.627264 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1325d7b5-2352-42c2-bed0-9105dd98593d" (UID: "1325d7b5-2352-42c2-bed0-9105dd98593d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.703626 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.703995 4867 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1325d7b5-2352-42c2-bed0-9105dd98593d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.704007 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1325d7b5-2352-42c2-bed0-9105dd98593d-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.704017 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk49d\" (UniqueName: \"kubernetes.io/projected/1325d7b5-2352-42c2-bed0-9105dd98593d-kube-api-access-kk49d\") on node \"crc\" 
DevicePath \"\"" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.704028 4867 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.716795 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1325d7b5-2352-42c2-bed0-9105dd98593d" (UID: "1325d7b5-2352-42c2-bed0-9105dd98593d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.723456 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data" (OuterVolumeSpecName: "config-data") pod "1325d7b5-2352-42c2-bed0-9105dd98593d" (UID: "1325d7b5-2352-42c2-bed0-9105dd98593d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.738411 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1325d7b5-2352-42c2-bed0-9105dd98593d","Type":"ContainerDied","Data":"2f7b75e7a9c5ef44d7d42e91b0ae0a42e854f808cc3b09e2fc24d75ce63ea1c5"} Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.738485 4867 scope.go:117] "RemoveContainer" containerID="8371d8ca6dfb70ddcb4407bd0fb5bc5ccec77f0b3d9d77dc53c879193088c37b" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.738690 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.806304 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.806342 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1325d7b5-2352-42c2-bed0-9105dd98593d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.833327 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.845414 4867 scope.go:117] "RemoveContainer" containerID="fe6a1439ea2216b133cc72e765f250d593e851fe0cbe7ac615617303978b8d47" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.863322 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.874665 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 13:22:36 crc kubenswrapper[4867]: E1006 13:22:36.875521 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1f9710-07da-4226-befb-73474a496cae" containerName="horizon" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.875541 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1f9710-07da-4226-befb-73474a496cae" containerName="horizon" Oct 06 13:22:36 crc kubenswrapper[4867]: E1006 13:22:36.875562 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a642d87-db23-4d12-90ac-ebdfbfe00996" containerName="horizon" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.875584 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a642d87-db23-4d12-90ac-ebdfbfe00996" containerName="horizon" Oct 06 13:22:36 crc 
kubenswrapper[4867]: E1006 13:22:36.875596 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1f9710-07da-4226-befb-73474a496cae" containerName="horizon-log" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.875603 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1f9710-07da-4226-befb-73474a496cae" containerName="horizon-log" Oct 06 13:22:36 crc kubenswrapper[4867]: E1006 13:22:36.875621 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1325d7b5-2352-42c2-bed0-9105dd98593d" containerName="cinder-api-log" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.875627 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1325d7b5-2352-42c2-bed0-9105dd98593d" containerName="cinder-api-log" Oct 06 13:22:36 crc kubenswrapper[4867]: E1006 13:22:36.875669 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1325d7b5-2352-42c2-bed0-9105dd98593d" containerName="cinder-api" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.875678 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1325d7b5-2352-42c2-bed0-9105dd98593d" containerName="cinder-api" Oct 06 13:22:36 crc kubenswrapper[4867]: E1006 13:22:36.875689 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8819e30-41b5-4fcd-8158-b6b5c178aea9" containerName="horizon" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.875696 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8819e30-41b5-4fcd-8158-b6b5c178aea9" containerName="horizon" Oct 06 13:22:36 crc kubenswrapper[4867]: E1006 13:22:36.875711 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8819e30-41b5-4fcd-8158-b6b5c178aea9" containerName="horizon-log" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.875716 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8819e30-41b5-4fcd-8158-b6b5c178aea9" containerName="horizon-log" Oct 06 13:22:36 crc kubenswrapper[4867]: E1006 13:22:36.875762 4867 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a642d87-db23-4d12-90ac-ebdfbfe00996" containerName="horizon-log" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.875769 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a642d87-db23-4d12-90ac-ebdfbfe00996" containerName="horizon-log" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.876066 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a642d87-db23-4d12-90ac-ebdfbfe00996" containerName="horizon-log" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.876089 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8819e30-41b5-4fcd-8158-b6b5c178aea9" containerName="horizon-log" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.876100 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1325d7b5-2352-42c2-bed0-9105dd98593d" containerName="cinder-api-log" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.876119 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1f9710-07da-4226-befb-73474a496cae" containerName="horizon-log" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.876149 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1325d7b5-2352-42c2-bed0-9105dd98593d" containerName="cinder-api" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.876162 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8819e30-41b5-4fcd-8158-b6b5c178aea9" containerName="horizon" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.876170 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1f9710-07da-4226-befb-73474a496cae" containerName="horizon" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.876190 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a642d87-db23-4d12-90ac-ebdfbfe00996" containerName="horizon" Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.878296 4867 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.887340 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.887578 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.887752 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.893819 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-69d5cf7ffb-c2rgt"
Oct 06 13:22:36 crc kubenswrapper[4867]: I1006 13:22:36.894548 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.010128 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-config-data\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.010371 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-scripts\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.010502 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.010538 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.010562 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-logs\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.010635 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.010743 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.010765 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-config-data-custom\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.010803 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrp48\" (UniqueName: \"kubernetes.io/projected/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-kube-api-access-vrp48\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.040353 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.045381 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.182:8080/\": dial tcp 10.217.0.182:8080: connect: connection refused"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.113237 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.113416 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.113571 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.114976 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-config-data-custom\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.115042 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrp48\" (UniqueName: \"kubernetes.io/projected/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-kube-api-access-vrp48\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.115192 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-config-data\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.115289 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-scripts\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.116208 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.116410 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.116443 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-logs\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.116484 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58bc94ddbb-kpk6w"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.120887 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-logs\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.121969 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.122185 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-config-data-custom\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.125848 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.126826 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-scripts\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.128477 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-config-data\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.129851 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.149829 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrp48\" (UniqueName: \"kubernetes.io/projected/678b77f0-1e51-4788-a7e0-4bc2560a9c6a-kube-api-access-vrp48\") pod \"cinder-api-0\" (UID: \"678b77f0-1e51-4788-a7e0-4bc2560a9c6a\") " pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.182492 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76747ff567-27q8x"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.250861 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.265719 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1325d7b5-2352-42c2-bed0-9105dd98593d" path="/var/lib/kubelet/pods/1325d7b5-2352-42c2-bed0-9105dd98593d/volumes"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.266387 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a642d87-db23-4d12-90ac-ebdfbfe00996" path="/var/lib/kubelet/pods/2a642d87-db23-4d12-90ac-ebdfbfe00996/volumes"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.282486 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dbf54f95-fr228"]
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.282728 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86dbf54f95-fr228" podUID="a4334a9a-d6f0-418a-958e-755336a58527" containerName="dnsmasq-dns" containerID="cri-o://3adaf03bbf319a863a4fe29ba6bbb6df6f0f41e10bbc9bae81ee9b45129f3e32" gracePeriod=10
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.373901 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58bc94ddbb-kpk6w"
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.817634 4867 generic.go:334] "Generic (PLEG): container finished" podID="a4334a9a-d6f0-418a-958e-755336a58527" containerID="3adaf03bbf319a863a4fe29ba6bbb6df6f0f41e10bbc9bae81ee9b45129f3e32" exitCode=0
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.817874 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dbf54f95-fr228" event={"ID":"a4334a9a-d6f0-418a-958e-755336a58527","Type":"ContainerDied","Data":"3adaf03bbf319a863a4fe29ba6bbb6df6f0f41e10bbc9bae81ee9b45129f3e32"}
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.823751 4867 generic.go:334] "Generic (PLEG): container finished" podID="f6218a59-1db5-4438-9fda-7781c1d4978b" containerID="2907d6d90ebdd474879af9266eaa64ceae93e8d4b8d59a82e2a647b608165118" exitCode=0
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.823801 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f7bcb84f4-pcrvc" event={"ID":"f6218a59-1db5-4438-9fda-7781c1d4978b","Type":"ContainerDied","Data":"2907d6d90ebdd474879af9266eaa64ceae93e8d4b8d59a82e2a647b608165118"}
Oct 06 13:22:37 crc kubenswrapper[4867]: I1006 13:22:37.920952 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.214148 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dbf54f95-fr228"
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.383006 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-config\") pod \"a4334a9a-d6f0-418a-958e-755336a58527\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") "
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.383073 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-sb\") pod \"a4334a9a-d6f0-418a-958e-755336a58527\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") "
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.383215 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-nb\") pod \"a4334a9a-d6f0-418a-958e-755336a58527\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") "
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.383313 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-svc\") pod \"a4334a9a-d6f0-418a-958e-755336a58527\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") "
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.383332 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsb6m\" (UniqueName: \"kubernetes.io/projected/a4334a9a-d6f0-418a-958e-755336a58527-kube-api-access-fsb6m\") pod \"a4334a9a-d6f0-418a-958e-755336a58527\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") "
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.383382 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-swift-storage-0\") pod \"a4334a9a-d6f0-418a-958e-755336a58527\" (UID: \"a4334a9a-d6f0-418a-958e-755336a58527\") "
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.404770 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4334a9a-d6f0-418a-958e-755336a58527-kube-api-access-fsb6m" (OuterVolumeSpecName: "kube-api-access-fsb6m") pod "a4334a9a-d6f0-418a-958e-755336a58527" (UID: "a4334a9a-d6f0-418a-958e-755336a58527"). InnerVolumeSpecName "kube-api-access-fsb6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.485369 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsb6m\" (UniqueName: \"kubernetes.io/projected/a4334a9a-d6f0-418a-958e-755336a58527-kube-api-access-fsb6m\") on node \"crc\" DevicePath \"\""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.491414 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a4334a9a-d6f0-418a-958e-755336a58527" (UID: "a4334a9a-d6f0-418a-958e-755336a58527"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.533784 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4334a9a-d6f0-418a-958e-755336a58527" (UID: "a4334a9a-d6f0-418a-958e-755336a58527"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.548819 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a4334a9a-d6f0-418a-958e-755336a58527" (UID: "a4334a9a-d6f0-418a-958e-755336a58527"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.591638 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.591710 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.591726 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.655447 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4334a9a-d6f0-418a-958e-755336a58527" (UID: "a4334a9a-d6f0-418a-958e-755336a58527"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.658984 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-config" (OuterVolumeSpecName: "config") pod "a4334a9a-d6f0-418a-958e-755336a58527" (UID: "a4334a9a-d6f0-418a-958e-755336a58527"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.693884 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-config\") on node \"crc\" DevicePath \"\""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.693916 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4334a9a-d6f0-418a-958e-755336a58527-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.847518 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86dbf54f95-fr228"
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.848418 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86dbf54f95-fr228" event={"ID":"a4334a9a-d6f0-418a-958e-755336a58527","Type":"ContainerDied","Data":"8ef2deda54353d99c01f0aa42c5dd7c30961dea9ae094468e5ab0fab4a14f505"}
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.848524 4867 scope.go:117] "RemoveContainer" containerID="3adaf03bbf319a863a4fe29ba6bbb6df6f0f41e10bbc9bae81ee9b45129f3e32"
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.863368 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"678b77f0-1e51-4788-a7e0-4bc2560a9c6a","Type":"ContainerStarted","Data":"2f5b529f9769723364def7a5898573c2291636c72955310021e7e94f27ca0341"}
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.970481 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86dbf54f95-fr228"]
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.975485 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.976194 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86dbf54f95-fr228"]
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.976294 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.976782 4867 scope.go:117] "RemoveContainer" containerID="2db97eecc610c3e34e2ba47ca51b7c8081ac513991aa1abd2d67d0e2841f3efe"
Oct 06 13:22:38 crc kubenswrapper[4867]: I1006 13:22:38.976899 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Oct 06 13:22:38 crc kubenswrapper[4867]: E1006 13:22:38.977270 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c54babca-19bd-4a0b-a320-359b744ed066)\"" pod="openstack/watcher-decision-engine-0" podUID="c54babca-19bd-4a0b-a320-359b744ed066"
Oct 06 13:22:39 crc kubenswrapper[4867]: I1006 13:22:39.236578 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4334a9a-d6f0-418a-958e-755336a58527" path="/var/lib/kubelet/pods/a4334a9a-d6f0-418a-958e-755336a58527/volumes"
Oct 06 13:22:39 crc kubenswrapper[4867]: I1006 13:22:39.897110 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"678b77f0-1e51-4788-a7e0-4bc2560a9c6a","Type":"ContainerStarted","Data":"08238c7c4f1756455c2a9b340f81393d3cf58e3e0243adab3a0a5342c28db952"}
Oct 06 13:22:39 crc kubenswrapper[4867]: I1006 13:22:39.897852 4867 scope.go:117] "RemoveContainer" containerID="2db97eecc610c3e34e2ba47ca51b7c8081ac513991aa1abd2d67d0e2841f3efe"
Oct 06 13:22:39 crc kubenswrapper[4867]: I1006 13:22:39.992756 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-594954fbc6-c2fc2"
Oct 06 13:22:40 crc kubenswrapper[4867]: I1006 13:22:40.029203 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-594954fbc6-c2fc2"
Oct 06 13:22:40 crc kubenswrapper[4867]: I1006 13:22:40.321809 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-577bfb968d-pw7pq"
Oct 06 13:22:40 crc kubenswrapper[4867]: I1006 13:22:40.411003 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-69d5cf7ffb-c2rgt"
Oct 06 13:22:40 crc kubenswrapper[4867]: I1006 13:22:40.472911 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-577bfb968d-pw7pq"]
Oct 06 13:22:40 crc kubenswrapper[4867]: I1006 13:22:40.906737 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-577bfb968d-pw7pq" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon-log" containerID="cri-o://5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686" gracePeriod=30
Oct 06 13:22:40 crc kubenswrapper[4867]: I1006 13:22:40.906868 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-577bfb968d-pw7pq" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon" containerID="cri-o://8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1" gracePeriod=30
Oct 06 13:22:41 crc kubenswrapper[4867]: I1006 13:22:41.917694 4867 generic.go:334] "Generic (PLEG): container finished" podID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerID="8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1" exitCode=0
Oct 06 13:22:41 crc kubenswrapper[4867]: I1006 13:22:41.917861 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-577bfb968d-pw7pq" event={"ID":"77541c32-3bc1-402d-aa9f-924f9b6cb37f","Type":"ContainerDied","Data":"8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1"}
Oct 06 13:22:42 crc kubenswrapper[4867]: I1006 13:22:42.118064 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-577bfb968d-pw7pq" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.164:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.164:8443: connect: connection refused"
Oct 06 13:22:42 crc kubenswrapper[4867]: I1006 13:22:42.228539 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 06 13:22:42 crc kubenswrapper[4867]: I1006 13:22:42.295510 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 06 13:22:42 crc kubenswrapper[4867]: I1006 13:22:42.928349 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" containerName="cinder-scheduler" containerID="cri-o://89a6aa6de775e66b0aa0b2f71bfec0c4ccd596c4612ed6ef676a3a5988aeeeb2" gracePeriod=30
Oct 06 13:22:42 crc kubenswrapper[4867]: I1006 13:22:42.928427 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" containerName="probe" containerID="cri-o://cb27dfcc69fb5281dca40d4de5e84a807e649445d57d8ecdffb071e7a42e89a4" gracePeriod=30
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.778506 4867 scope.go:117] "RemoveContainer" containerID="fc7e3f04f451d1fcc1bc448b4c361da1e444b060f8f92f675eb101095cfcaddb"
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.807637 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86d4db6f74-khhjk"
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.816954 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f7bcb84f4-pcrvc"
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.899105 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86d4db6f74-khhjk"
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.908013 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-httpd-config\") pod \"f6218a59-1db5-4438-9fda-7781c1d4978b\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") "
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.908138 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-ovndb-tls-certs\") pod \"f6218a59-1db5-4438-9fda-7781c1d4978b\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") "
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.908170 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-config\") pod \"f6218a59-1db5-4438-9fda-7781c1d4978b\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") "
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.908302 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-combined-ca-bundle\") pod \"f6218a59-1db5-4438-9fda-7781c1d4978b\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") "
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.908422 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s6ms\" (UniqueName: \"kubernetes.io/projected/f6218a59-1db5-4438-9fda-7781c1d4978b-kube-api-access-9s6ms\") pod \"f6218a59-1db5-4438-9fda-7781c1d4978b\" (UID: \"f6218a59-1db5-4438-9fda-7781c1d4978b\") "
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.915488 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f6218a59-1db5-4438-9fda-7781c1d4978b" (UID: "f6218a59-1db5-4438-9fda-7781c1d4978b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.923397 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6218a59-1db5-4438-9fda-7781c1d4978b-kube-api-access-9s6ms" (OuterVolumeSpecName: "kube-api-access-9s6ms") pod "f6218a59-1db5-4438-9fda-7781c1d4978b" (UID: "f6218a59-1db5-4438-9fda-7781c1d4978b"). InnerVolumeSpecName "kube-api-access-9s6ms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.978919 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58bc94ddbb-kpk6w"]
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.979243 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58bc94ddbb-kpk6w" podUID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerName="barbican-api-log" containerID="cri-o://d8758d3ca0058ce6cc97efb76387c8f23b72b5e8f190be9e2941e226b9989a2a" gracePeriod=30
Oct 06 13:22:43 crc kubenswrapper[4867]: I1006 13:22:43.979853 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58bc94ddbb-kpk6w" podUID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerName="barbican-api" containerID="cri-o://925011d1366643c86474669b03d4c9cd1af3c1c28b81be813f68d4b94ec7c743" gracePeriod=30
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.000612 4867 generic.go:334] "Generic (PLEG): container finished" podID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" containerID="cb27dfcc69fb5281dca40d4de5e84a807e649445d57d8ecdffb071e7a42e89a4" exitCode=0
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.000761 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f","Type":"ContainerDied","Data":"cb27dfcc69fb5281dca40d4de5e84a807e649445d57d8ecdffb071e7a42e89a4"}
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.011699 4867 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-httpd-config\") on node \"crc\" DevicePath \"\""
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.011734 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s6ms\" (UniqueName: \"kubernetes.io/projected/f6218a59-1db5-4438-9fda-7781c1d4978b-kube-api-access-9s6ms\") on node \"crc\" DevicePath \"\""
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.033452 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f7bcb84f4-pcrvc" event={"ID":"f6218a59-1db5-4438-9fda-7781c1d4978b","Type":"ContainerDied","Data":"8c79e3d74cff0ba121ff474bc02fbab6af712544ad541b963a997aa7d34c54be"}
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.033550 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f7bcb84f4-pcrvc"
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.049638 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6218a59-1db5-4438-9fda-7781c1d4978b" (UID: "f6218a59-1db5-4438-9fda-7781c1d4978b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.056648 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-config" (OuterVolumeSpecName: "config") pod "f6218a59-1db5-4438-9fda-7781c1d4978b" (UID: "f6218a59-1db5-4438-9fda-7781c1d4978b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.114948 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-config\") on node \"crc\" DevicePath \"\""
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.114993 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.130756 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f6218a59-1db5-4438-9fda-7781c1d4978b" (UID: "f6218a59-1db5-4438-9fda-7781c1d4978b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.216697 4867 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6218a59-1db5-4438-9fda-7781c1d4978b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.493792 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f7bcb84f4-pcrvc"]
Oct 06 13:22:44 crc kubenswrapper[4867]: I1006 13:22:44.512449 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5f7bcb84f4-pcrvc"]
Oct 06 13:22:45 crc kubenswrapper[4867]: I1006 13:22:45.064535 4867 generic.go:334] "Generic (PLEG): container finished" podID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerID="d8758d3ca0058ce6cc97efb76387c8f23b72b5e8f190be9e2941e226b9989a2a" exitCode=143
Oct 06 13:22:45 crc kubenswrapper[4867]: I1006 13:22:45.064591 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58bc94ddbb-kpk6w" event={"ID":"97cb892b-93d3-4c28-8c32-e2abaaa80dce","Type":"ContainerDied","Data":"d8758d3ca0058ce6cc97efb76387c8f23b72b5e8f190be9e2941e226b9989a2a"}
Oct 06 13:22:45 crc kubenswrapper[4867]: I1006 13:22:45.066719 4867 generic.go:334] "Generic (PLEG): container finished" podID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" containerID="89a6aa6de775e66b0aa0b2f71bfec0c4ccd596c4612ed6ef676a3a5988aeeeb2" exitCode=0
Oct 06 13:22:45 crc kubenswrapper[4867]: I1006 13:22:45.066744 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f","Type":"ContainerDied","Data":"89a6aa6de775e66b0aa0b2f71bfec0c4ccd596c4612ed6ef676a3a5988aeeeb2"}
Oct 06 13:22:45 crc kubenswrapper[4867]: I1006 13:22:45.243198 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6218a59-1db5-4438-9fda-7781c1d4978b" path="/var/lib/kubelet/pods/f6218a59-1db5-4438-9fda-7781c1d4978b/volumes"
Oct 06 13:22:45 crc kubenswrapper[4867]: I1006 13:22:45.661463 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58bc94ddbb-kpk6w" podUID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.181:9311/healthcheck\": read tcp 10.217.0.2:46132->10.217.0.181:9311: read: connection reset by peer"
Oct 06 13:22:45 crc kubenswrapper[4867]: I1006 13:22:45.662188 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-58bc94ddbb-kpk6w" podUID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.181:9311/healthcheck\": read tcp 10.217.0.2:46134->10.217.0.181:9311: read: connection reset by peer"
Oct 06 13:22:45 crc kubenswrapper[4867]: I1006 13:22:45.912180 4867 scope.go:117] "RemoveContainer" containerID="6ebdeedb331438c0c01a083fb0c28d23295c8aa33cc117bd9ce5f3ef209832d0"
Oct 06 13:22:46 crc kubenswrapper[4867]: I1006 13:22:46.079545 4867 generic.go:334] "Generic (PLEG): container finished" podID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerID="925011d1366643c86474669b03d4c9cd1af3c1c28b81be813f68d4b94ec7c743" exitCode=0
Oct 06 13:22:46 crc kubenswrapper[4867]: I1006 13:22:46.079639 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58bc94ddbb-kpk6w" event={"ID":"97cb892b-93d3-4c28-8c32-e2abaaa80dce","Type":"ContainerDied","Data":"925011d1366643c86474669b03d4c9cd1af3c1c28b81be813f68d4b94ec7c743"}
Oct 06 13:22:46 crc kubenswrapper[4867]: I1006 13:22:46.773427 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-dcf7c7d6f-dz9mk"
Oct 06 13:22:46 crc kubenswrapper[4867]: I1006 13:22:46.915883 4867 scope.go:117] "RemoveContainer" containerID="2907d6d90ebdd474879af9266eaa64ceae93e8d4b8d59a82e2a647b608165118"
Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.118391 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58bc94ddbb-kpk6w" event={"ID":"97cb892b-93d3-4c28-8c32-e2abaaa80dce","Type":"ContainerDied","Data":"d47779adb6edadfcc3f1b8f986bee4acec3d885cb7d9d8b243a1d1a396f1aabe"}
Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.118827 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d47779adb6edadfcc3f1b8f986bee4acec3d885cb7d9d8b243a1d1a396f1aabe"
Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.124222 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.124454 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f","Type":"ContainerDied","Data":"63ef29b625b547d9d2e0b113b98572bf47901a6b63e948b8e28239f8cd8db653"}
Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.124530 4867 scope.go:117] "RemoveContainer" containerID="cb27dfcc69fb5281dca40d4de5e84a807e649445d57d8ecdffb071e7a42e89a4"
Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.132942 4867 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.187560 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-combined-ca-bundle\") pod \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.187660 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-combined-ca-bundle\") pod \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.187765 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shlzz\" (UniqueName: \"kubernetes.io/projected/97cb892b-93d3-4c28-8c32-e2abaaa80dce-kube-api-access-shlzz\") pod \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.187795 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gc6z\" (UniqueName: \"kubernetes.io/projected/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-kube-api-access-4gc6z\") pod \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.187839 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-scripts\") pod \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.187878 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cb892b-93d3-4c28-8c32-e2abaaa80dce-logs\") pod \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.187945 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data\") pod \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.187992 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-etc-machine-id\") pod \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.188041 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data\") pod \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.188184 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data-custom\") pod \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\" (UID: \"8db4cf0a-3639-491c-8a8d-bc1e0c802c7f\") " Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.188219 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data-custom\") pod \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\" (UID: \"97cb892b-93d3-4c28-8c32-e2abaaa80dce\") " Oct 06 13:22:47 crc 
kubenswrapper[4867]: I1006 13:22:47.189362 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" (UID: "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.189889 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97cb892b-93d3-4c28-8c32-e2abaaa80dce-logs" (OuterVolumeSpecName: "logs") pod "97cb892b-93d3-4c28-8c32-e2abaaa80dce" (UID: "97cb892b-93d3-4c28-8c32-e2abaaa80dce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.195486 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "97cb892b-93d3-4c28-8c32-e2abaaa80dce" (UID: "97cb892b-93d3-4c28-8c32-e2abaaa80dce"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.205393 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-scripts" (OuterVolumeSpecName: "scripts") pod "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" (UID: "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.205386 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97cb892b-93d3-4c28-8c32-e2abaaa80dce-kube-api-access-shlzz" (OuterVolumeSpecName: "kube-api-access-shlzz") pod "97cb892b-93d3-4c28-8c32-e2abaaa80dce" (UID: "97cb892b-93d3-4c28-8c32-e2abaaa80dce"). InnerVolumeSpecName "kube-api-access-shlzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.206530 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-kube-api-access-4gc6z" (OuterVolumeSpecName: "kube-api-access-4gc6z") pod "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" (UID: "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f"). InnerVolumeSpecName "kube-api-access-4gc6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.225441 4867 scope.go:117] "RemoveContainer" containerID="89a6aa6de775e66b0aa0b2f71bfec0c4ccd596c4612ed6ef676a3a5988aeeeb2" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.225630 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" (UID: "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.291980 4867 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.292008 4867 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.292018 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shlzz\" (UniqueName: \"kubernetes.io/projected/97cb892b-93d3-4c28-8c32-e2abaaa80dce-kube-api-access-shlzz\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.292029 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gc6z\" (UniqueName: \"kubernetes.io/projected/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-kube-api-access-4gc6z\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.292038 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.292047 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97cb892b-93d3-4c28-8c32-e2abaaa80dce-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.292055 4867 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.416199 4867 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97cb892b-93d3-4c28-8c32-e2abaaa80dce" (UID: "97cb892b-93d3-4c28-8c32-e2abaaa80dce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.432347 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data" (OuterVolumeSpecName: "config-data") pod "97cb892b-93d3-4c28-8c32-e2abaaa80dce" (UID: "97cb892b-93d3-4c28-8c32-e2abaaa80dce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.476879 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" (UID: "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.496554 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.496589 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.496600 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97cb892b-93d3-4c28-8c32-e2abaaa80dce-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.502207 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data" (OuterVolumeSpecName: "config-data") pod "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" (UID: "8db4cf0a-3639-491c-8a8d-bc1e0c802c7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:47 crc kubenswrapper[4867]: I1006 13:22:47.598170 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.142741 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.146613 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c54babca-19bd-4a0b-a320-359b744ed066","Type":"ContainerStarted","Data":"890eb876f39d7213ab994e4f7878a7aa46933c45bc36e78c02b7645a4eba0da0"} Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.151024 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01","Type":"ContainerStarted","Data":"76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c"} Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.151112 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.151100 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="ceilometer-central-agent" containerID="cri-o://81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2" gracePeriod=30 Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.151142 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="ceilometer-notification-agent" containerID="cri-o://7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1" gracePeriod=30 Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.151150 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="sg-core" containerID="cri-o://4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4" gracePeriod=30 Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.151130 4867 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="proxy-httpd" containerID="cri-o://76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c" gracePeriod=30 Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.157552 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58bc94ddbb-kpk6w" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.157567 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"678b77f0-1e51-4788-a7e0-4bc2560a9c6a","Type":"ContainerStarted","Data":"fed4685bee2210ef9780b8078bdf4d2ba9476918323253552bc8b76e42ce70d1"} Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.157760 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.252332 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.307553 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.318937 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 13:22:48 crc kubenswrapper[4867]: E1006 13:22:48.319374 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4334a9a-d6f0-418a-958e-755336a58527" containerName="dnsmasq-dns" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319392 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4334a9a-d6f0-418a-958e-755336a58527" containerName="dnsmasq-dns" Oct 06 13:22:48 crc kubenswrapper[4867]: E1006 13:22:48.319406 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4334a9a-d6f0-418a-958e-755336a58527" containerName="init" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319415 4867 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a4334a9a-d6f0-418a-958e-755336a58527" containerName="init" Oct 06 13:22:48 crc kubenswrapper[4867]: E1006 13:22:48.319429 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerName="barbican-api" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319435 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerName="barbican-api" Oct 06 13:22:48 crc kubenswrapper[4867]: E1006 13:22:48.319443 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerName="barbican-api-log" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319449 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerName="barbican-api-log" Oct 06 13:22:48 crc kubenswrapper[4867]: E1006 13:22:48.319460 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" containerName="probe" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319466 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" containerName="probe" Oct 06 13:22:48 crc kubenswrapper[4867]: E1006 13:22:48.319481 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6218a59-1db5-4438-9fda-7781c1d4978b" containerName="neutron-httpd" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319488 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6218a59-1db5-4438-9fda-7781c1d4978b" containerName="neutron-httpd" Oct 06 13:22:48 crc kubenswrapper[4867]: E1006 13:22:48.319507 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" containerName="cinder-scheduler" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319513 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" containerName="cinder-scheduler" Oct 06 13:22:48 crc kubenswrapper[4867]: E1006 13:22:48.319527 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6218a59-1db5-4438-9fda-7781c1d4978b" containerName="neutron-api" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319533 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6218a59-1db5-4438-9fda-7781c1d4978b" containerName="neutron-api" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319710 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerName="barbican-api" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319724 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6218a59-1db5-4438-9fda-7781c1d4978b" containerName="neutron-api" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319735 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4334a9a-d6f0-418a-958e-755336a58527" containerName="dnsmasq-dns" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319742 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" containerName="cinder-scheduler" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319752 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" containerName="barbican-api-log" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319764 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6218a59-1db5-4438-9fda-7781c1d4978b" containerName="neutron-httpd" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.319775 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" containerName="probe" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.320877 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.323639 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.334338 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.343501 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.86833742 podStartE2EDuration="1m27.34348234s" podCreationTimestamp="2025-10-06 13:21:21 +0000 UTC" firstStartedPulling="2025-10-06 13:21:24.444694694 +0000 UTC m=+1063.902642838" lastFinishedPulling="2025-10-06 13:22:46.919839614 +0000 UTC m=+1146.377787758" observedRunningTime="2025-10-06 13:22:48.23301952 +0000 UTC m=+1147.690967664" watchObservedRunningTime="2025-10-06 13:22:48.34348234 +0000 UTC m=+1147.801430484" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.356159 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=12.356140066 podStartE2EDuration="12.356140066s" podCreationTimestamp="2025-10-06 13:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:48.281999839 +0000 UTC m=+1147.739947983" watchObservedRunningTime="2025-10-06 13:22:48.356140066 +0000 UTC m=+1147.814088210" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.366320 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58bc94ddbb-kpk6w"] Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.373085 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-58bc94ddbb-kpk6w"] Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.423011 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.423095 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-scripts\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.423125 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-config-data\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.423151 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9qz7\" (UniqueName: \"kubernetes.io/projected/a3850291-2d24-472c-9ef2-7f2814c4c321-kube-api-access-r9qz7\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.423192 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3850291-2d24-472c-9ef2-7f2814c4c321-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.424365 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.525806 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.525878 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-scripts\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.525910 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-config-data\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.525935 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9qz7\" (UniqueName: \"kubernetes.io/projected/a3850291-2d24-472c-9ef2-7f2814c4c321-kube-api-access-r9qz7\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.525965 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3850291-2d24-472c-9ef2-7f2814c4c321-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.525999 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.526150 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3850291-2d24-472c-9ef2-7f2814c4c321-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.531933 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.533582 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-scripts\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.534576 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.543017 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3850291-2d24-472c-9ef2-7f2814c4c321-config-data\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.549977 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9qz7\" (UniqueName: \"kubernetes.io/projected/a3850291-2d24-472c-9ef2-7f2814c4c321-kube-api-access-r9qz7\") pod \"cinder-scheduler-0\" (UID: \"a3850291-2d24-472c-9ef2-7f2814c4c321\") " pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.650265 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 13:22:48 crc kubenswrapper[4867]: I1006 13:22:48.973659 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.015371 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.174233 4867 generic.go:334] "Generic (PLEG): container finished" podID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerID="76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c" exitCode=0 Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.174309 4867 generic.go:334] "Generic (PLEG): container finished" podID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerID="4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4" exitCode=2 Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.174322 4867 generic.go:334] "Generic (PLEG): container finished" podID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerID="81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2" exitCode=0 Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.174341 
4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01","Type":"ContainerDied","Data":"76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c"} Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.174397 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01","Type":"ContainerDied","Data":"4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4"} Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.174407 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01","Type":"ContainerDied","Data":"81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2"} Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.175579 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.176352 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.215547 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.238136 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db4cf0a-3639-491c-8a8d-bc1e0c802c7f" path="/var/lib/kubelet/pods/8db4cf0a-3639-491c-8a8d-bc1e0c802c7f/volumes" Oct 06 13:22:49 crc kubenswrapper[4867]: I1006 13:22:49.239403 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97cb892b-93d3-4c28-8c32-e2abaaa80dce" path="/var/lib/kubelet/pods/97cb892b-93d3-4c28-8c32-e2abaaa80dce/volumes" Oct 06 13:22:50 crc kubenswrapper[4867]: I1006 13:22:50.196836 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"a3850291-2d24-472c-9ef2-7f2814c4c321","Type":"ContainerStarted","Data":"64b6261ab260db1e2cd706a8ccaa56b5cf5981d4d0de7e4ec3d40b79dfc4cef1"} Oct 06 13:22:50 crc kubenswrapper[4867]: I1006 13:22:50.197338 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a3850291-2d24-472c-9ef2-7f2814c4c321","Type":"ContainerStarted","Data":"374edcde7e52c1372a2bfacf4d8b972099c03405bb7b02dec7e5e3dce1e21933"} Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.212398 4867 generic.go:334] "Generic (PLEG): container finished" podID="c54babca-19bd-4a0b-a320-359b744ed066" containerID="890eb876f39d7213ab994e4f7878a7aa46933c45bc36e78c02b7645a4eba0da0" exitCode=1 Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.212568 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c54babca-19bd-4a0b-a320-359b744ed066","Type":"ContainerDied","Data":"890eb876f39d7213ab994e4f7878a7aa46933c45bc36e78c02b7645a4eba0da0"} Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.212995 4867 scope.go:117] "RemoveContainer" containerID="890eb876f39d7213ab994e4f7878a7aa46933c45bc36e78c02b7645a4eba0da0" Oct 06 13:22:51 crc kubenswrapper[4867]: E1006 13:22:51.213335 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c54babca-19bd-4a0b-a320-359b744ed066)\"" pod="openstack/watcher-decision-engine-0" podUID="c54babca-19bd-4a0b-a320-359b744ed066" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.213398 4867 scope.go:117] "RemoveContainer" containerID="2db97eecc610c3e34e2ba47ca51b7c8081ac513991aa1abd2d67d0e2841f3efe" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.245416 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"a3850291-2d24-472c-9ef2-7f2814c4c321","Type":"ContainerStarted","Data":"da305ef1adfc17c0d836f3f7edc6f9aede05ab2325974cbd84a32cd7bd1923c6"} Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.321698 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.321672862 podStartE2EDuration="3.321672862s" podCreationTimestamp="2025-10-06 13:22:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:51.311010981 +0000 UTC m=+1150.768959145" watchObservedRunningTime="2025-10-06 13:22:51.321672862 +0000 UTC m=+1150.779621006" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.441787 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.443181 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.445281 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.445401 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-jkkgk" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.446333 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.459239 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.504087 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-combined-ca-bundle\") pod 
\"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.504185 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config-secret\") pod \"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.504219 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config\") pod \"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.504952 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6kqf\" (UniqueName: \"kubernetes.io/projected/d4dd974b-0132-491e-8254-26c144b9c7a9-kube-api-access-k6kqf\") pod \"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.606637 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.606704 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config-secret\") pod \"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " 
pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.606737 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config\") pod \"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.606813 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6kqf\" (UniqueName: \"kubernetes.io/projected/d4dd974b-0132-491e-8254-26c144b9c7a9-kube-api-access-k6kqf\") pod \"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.607785 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config\") pod \"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.614897 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config-secret\") pod \"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.614983 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.625802 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-k6kqf\" (UniqueName: \"kubernetes.io/projected/d4dd974b-0132-491e-8254-26c144b9c7a9-kube-api-access-k6kqf\") pod \"openstackclient\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.654732 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.655772 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.680678 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.696552 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.698290 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.710193 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.810020 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7620829-b468-470c-899e-92faea8bc3c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.810134 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7620829-b468-470c-899e-92faea8bc3c7-openstack-config\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: 
I1006 13:22:51.810177 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7620829-b468-470c-899e-92faea8bc3c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.810195 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhb7\" (UniqueName: \"kubernetes.io/projected/b7620829-b468-470c-899e-92faea8bc3c7-kube-api-access-hnhb7\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.913062 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7620829-b468-470c-899e-92faea8bc3c7-openstack-config\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.913497 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7620829-b468-470c-899e-92faea8bc3c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.913516 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhb7\" (UniqueName: \"kubernetes.io/projected/b7620829-b468-470c-899e-92faea8bc3c7-kube-api-access-hnhb7\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.913592 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7620829-b468-470c-899e-92faea8bc3c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.914752 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b7620829-b468-470c-899e-92faea8bc3c7-openstack-config\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.918846 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7620829-b468-470c-899e-92faea8bc3c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.919777 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b7620829-b468-470c-899e-92faea8bc3c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: E1006 13:22:51.925709 4867 log.go:32] "RunPodSandbox from runtime service failed" err=< Oct 06 13:22:51 crc kubenswrapper[4867]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_d4dd974b-0132-491e-8254-26c144b9c7a9_0(ed6e080ccbfada5ff01dff97d5ab4136fd66dae2e9983402857ba573e0101337): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ed6e080ccbfada5ff01dff97d5ab4136fd66dae2e9983402857ba573e0101337" 
Netns:"/var/run/netns/000e7365-4f06-44f2-b022-6e294316df03" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ed6e080ccbfada5ff01dff97d5ab4136fd66dae2e9983402857ba573e0101337;K8S_POD_UID=d4dd974b-0132-491e-8254-26c144b9c7a9" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/d4dd974b-0132-491e-8254-26c144b9c7a9]: expected pod UID "d4dd974b-0132-491e-8254-26c144b9c7a9" but got "b7620829-b468-470c-899e-92faea8bc3c7" from Kube API Oct 06 13:22:51 crc kubenswrapper[4867]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 06 13:22:51 crc kubenswrapper[4867]: > Oct 06 13:22:51 crc kubenswrapper[4867]: E1006 13:22:51.925775 4867 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Oct 06 13:22:51 crc kubenswrapper[4867]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_d4dd974b-0132-491e-8254-26c144b9c7a9_0(ed6e080ccbfada5ff01dff97d5ab4136fd66dae2e9983402857ba573e0101337): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ed6e080ccbfada5ff01dff97d5ab4136fd66dae2e9983402857ba573e0101337" Netns:"/var/run/netns/000e7365-4f06-44f2-b022-6e294316df03" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ed6e080ccbfada5ff01dff97d5ab4136fd66dae2e9983402857ba573e0101337;K8S_POD_UID=d4dd974b-0132-491e-8254-26c144b9c7a9" Path:"" 
ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/d4dd974b-0132-491e-8254-26c144b9c7a9]: expected pod UID "d4dd974b-0132-491e-8254-26c144b9c7a9" but got "b7620829-b468-470c-899e-92faea8bc3c7" from Kube API Oct 06 13:22:51 crc kubenswrapper[4867]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 06 13:22:51 crc kubenswrapper[4867]: > pod="openstack/openstackclient" Oct 06 13:22:51 crc kubenswrapper[4867]: I1006 13:22:51.932688 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhb7\" (UniqueName: \"kubernetes.io/projected/b7620829-b468-470c-899e-92faea8bc3c7-kube-api-access-hnhb7\") pod \"openstackclient\" (UID: \"b7620829-b468-470c-899e-92faea8bc3c7\") " pod="openstack/openstackclient" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.118200 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-577bfb968d-pw7pq" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.164:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.164:8443: connect: connection refused" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.164868 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.244274 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.245609 4867 scope.go:117] "RemoveContainer" containerID="890eb876f39d7213ab994e4f7878a7aa46933c45bc36e78c02b7645a4eba0da0" Oct 06 13:22:52 crc kubenswrapper[4867]: E1006 13:22:52.245861 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c54babca-19bd-4a0b-a320-359b744ed066)\"" pod="openstack/watcher-decision-engine-0" podUID="c54babca-19bd-4a0b-a320-359b744ed066" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.250206 4867 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="d4dd974b-0132-491e-8254-26c144b9c7a9" podUID="b7620829-b468-470c-899e-92faea8bc3c7" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.325246 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.425447 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-combined-ca-bundle\") pod \"d4dd974b-0132-491e-8254-26c144b9c7a9\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.425504 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config-secret\") pod \"d4dd974b-0132-491e-8254-26c144b9c7a9\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.425809 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config\") pod \"d4dd974b-0132-491e-8254-26c144b9c7a9\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.426064 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6kqf\" (UniqueName: \"kubernetes.io/projected/d4dd974b-0132-491e-8254-26c144b9c7a9-kube-api-access-k6kqf\") pod \"d4dd974b-0132-491e-8254-26c144b9c7a9\" (UID: \"d4dd974b-0132-491e-8254-26c144b9c7a9\") " Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.426798 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d4dd974b-0132-491e-8254-26c144b9c7a9" (UID: "d4dd974b-0132-491e-8254-26c144b9c7a9"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.436558 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d4dd974b-0132-491e-8254-26c144b9c7a9" (UID: "d4dd974b-0132-491e-8254-26c144b9c7a9"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.441027 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dd974b-0132-491e-8254-26c144b9c7a9-kube-api-access-k6kqf" (OuterVolumeSpecName: "kube-api-access-k6kqf") pod "d4dd974b-0132-491e-8254-26c144b9c7a9" (UID: "d4dd974b-0132-491e-8254-26c144b9c7a9"). InnerVolumeSpecName "kube-api-access-k6kqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.441400 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4dd974b-0132-491e-8254-26c144b9c7a9" (UID: "d4dd974b-0132-491e-8254-26c144b9c7a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.529625 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.529700 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6kqf\" (UniqueName: \"kubernetes.io/projected/d4dd974b-0132-491e-8254-26c144b9c7a9-kube-api-access-k6kqf\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.529715 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.529725 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d4dd974b-0132-491e-8254-26c144b9c7a9-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.707997 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.932356 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7b666bc78f-zvlqd"] Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.934964 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.938759 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.938952 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.939070 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 06 13:22:52 crc kubenswrapper[4867]: I1006 13:22:52.953222 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7b666bc78f-zvlqd"] Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.047590 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06d3199-78ee-4389-bbd2-2bc53c012c84-run-httpd\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.047666 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-internal-tls-certs\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.047688 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a06d3199-78ee-4389-bbd2-2bc53c012c84-etc-swift\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 
13:22:53.047726 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9frq\" (UniqueName: \"kubernetes.io/projected/a06d3199-78ee-4389-bbd2-2bc53c012c84-kube-api-access-n9frq\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.047747 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-combined-ca-bundle\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.047783 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-public-tls-certs\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.047802 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06d3199-78ee-4389-bbd2-2bc53c012c84-log-httpd\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.047824 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-config-data\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 
06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.150620 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06d3199-78ee-4389-bbd2-2bc53c012c84-run-httpd\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.150728 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-internal-tls-certs\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.150763 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a06d3199-78ee-4389-bbd2-2bc53c012c84-etc-swift\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.150819 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-combined-ca-bundle\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.150839 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9frq\" (UniqueName: \"kubernetes.io/projected/a06d3199-78ee-4389-bbd2-2bc53c012c84-kube-api-access-n9frq\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 
13:22:53.150885 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-public-tls-certs\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.150908 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06d3199-78ee-4389-bbd2-2bc53c012c84-log-httpd\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.150936 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-config-data\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.151448 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06d3199-78ee-4389-bbd2-2bc53c012c84-run-httpd\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.151771 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06d3199-78ee-4389-bbd2-2bc53c012c84-log-httpd\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.158366 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-config-data\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.160128 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-public-tls-certs\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.168455 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a06d3199-78ee-4389-bbd2-2bc53c012c84-etc-swift\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.171123 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-combined-ca-bundle\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.197280 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9frq\" (UniqueName: \"kubernetes.io/projected/a06d3199-78ee-4389-bbd2-2bc53c012c84-kube-api-access-n9frq\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.205019 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a06d3199-78ee-4389-bbd2-2bc53c012c84-internal-tls-certs\") pod \"swift-proxy-7b666bc78f-zvlqd\" (UID: \"a06d3199-78ee-4389-bbd2-2bc53c012c84\") " pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.267697 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.299581 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4dd974b-0132-491e-8254-26c144b9c7a9" path="/var/lib/kubelet/pods/d4dd974b-0132-491e-8254-26c144b9c7a9/volumes" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.299819 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.303272 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b7620829-b468-470c-899e-92faea8bc3c7","Type":"ContainerStarted","Data":"c8bc146103eec4b0a6ca22f408ffb7025de8f50a330d9f2ca53667e887540257"} Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.416613 4867 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="d4dd974b-0132-491e-8254-26c144b9c7a9" podUID="b7620829-b468-470c-899e-92faea8bc3c7" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.652569 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 06 13:22:53 crc kubenswrapper[4867]: I1006 13:22:53.833721 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7b666bc78f-zvlqd"] Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.169360 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.278724 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-run-httpd\") pod \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.278814 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-scripts\") pod \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.278847 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-sg-core-conf-yaml\") pod \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.278887 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-log-httpd\") pod \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.278909 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-config-data\") pod \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.279044 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-combined-ca-bundle\") pod \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.279298 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57cch\" (UniqueName: \"kubernetes.io/projected/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-kube-api-access-57cch\") pod \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\" (UID: \"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01\") " Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.282114 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" (UID: "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.282145 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" (UID: "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.287488 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-scripts" (OuterVolumeSpecName: "scripts") pod "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" (UID: "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.289623 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-kube-api-access-57cch" (OuterVolumeSpecName: "kube-api-access-57cch") pod "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" (UID: "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01"). InnerVolumeSpecName "kube-api-access-57cch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.318278 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b666bc78f-zvlqd" event={"ID":"a06d3199-78ee-4389-bbd2-2bc53c012c84","Type":"ContainerStarted","Data":"1f7387264d0959e3af89cf14852ed55efac8edc571ed31876052fec3b4149928"} Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.318333 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b666bc78f-zvlqd" event={"ID":"a06d3199-78ee-4389-bbd2-2bc53c012c84","Type":"ContainerStarted","Data":"0181ccf3d582494ed903f2a04d946af9f46c07a26f2491a5308308dde0f80174"} Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.332516 4867 generic.go:334] "Generic (PLEG): container finished" podID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerID="7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1" exitCode=0 Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.332577 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01","Type":"ContainerDied","Data":"7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1"} Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.332613 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6fadf16c-2d2d-48bd-ab16-c2fac8c46b01","Type":"ContainerDied","Data":"a67fffe53a2617487eb3ce45a041622c565865ef52426b3440a27e771957b294"} Oct 
06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.332637 4867 scope.go:117] "RemoveContainer" containerID="76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.333276 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.334727 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" (UID: "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.386293 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57cch\" (UniqueName: \"kubernetes.io/projected/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-kube-api-access-57cch\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.386329 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.386344 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.386355 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.386367 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.415040 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" (UID: "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.431436 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-config-data" (OuterVolumeSpecName: "config-data") pod "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" (UID: "6fadf16c-2d2d-48bd-ab16-c2fac8c46b01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.488450 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.488689 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.519334 4867 scope.go:117] "RemoveContainer" containerID="4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.562983 4867 scope.go:117] "RemoveContainer" containerID="7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.604632 4867 scope.go:117] "RemoveContainer" 
containerID="81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.639723 4867 scope.go:117] "RemoveContainer" containerID="76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c" Oct 06 13:22:54 crc kubenswrapper[4867]: E1006 13:22:54.642546 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c\": container with ID starting with 76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c not found: ID does not exist" containerID="76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.642594 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c"} err="failed to get container status \"76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c\": rpc error: code = NotFound desc = could not find container \"76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c\": container with ID starting with 76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c not found: ID does not exist" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.642625 4867 scope.go:117] "RemoveContainer" containerID="4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4" Oct 06 13:22:54 crc kubenswrapper[4867]: E1006 13:22:54.643465 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4\": container with ID starting with 4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4 not found: ID does not exist" containerID="4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4" Oct 06 13:22:54 crc 
kubenswrapper[4867]: I1006 13:22:54.643501 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4"} err="failed to get container status \"4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4\": rpc error: code = NotFound desc = could not find container \"4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4\": container with ID starting with 4e70f0a45181766bb351837b030274b29f568109482d647d7d45c2c7711319e4 not found: ID does not exist" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.643531 4867 scope.go:117] "RemoveContainer" containerID="7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1" Oct 06 13:22:54 crc kubenswrapper[4867]: E1006 13:22:54.644052 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1\": container with ID starting with 7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1 not found: ID does not exist" containerID="7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.644080 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1"} err="failed to get container status \"7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1\": rpc error: code = NotFound desc = could not find container \"7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1\": container with ID starting with 7d60efae9ddcdaf9392ac756bc8e229aa25ea248fd78f6ab866b8f740e697ee1 not found: ID does not exist" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.644097 4867 scope.go:117] "RemoveContainer" containerID="81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2" Oct 06 
13:22:54 crc kubenswrapper[4867]: E1006 13:22:54.644806 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2\": container with ID starting with 81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2 not found: ID does not exist" containerID="81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.644842 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2"} err="failed to get container status \"81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2\": rpc error: code = NotFound desc = could not find container \"81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2\": container with ID starting with 81b98af574ed60b3383cf920f2c60cfd94b5ed4a16ff5124ee45daada62be0c2 not found: ID does not exist" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.684152 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.705443 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.728530 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:22:54 crc kubenswrapper[4867]: E1006 13:22:54.729029 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="ceilometer-central-agent" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.729048 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="ceilometer-central-agent" Oct 06 13:22:54 crc kubenswrapper[4867]: E1006 13:22:54.729071 4867 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="sg-core" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.729077 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="sg-core" Oct 06 13:22:54 crc kubenswrapper[4867]: E1006 13:22:54.729090 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="proxy-httpd" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.729096 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="proxy-httpd" Oct 06 13:22:54 crc kubenswrapper[4867]: E1006 13:22:54.729113 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="ceilometer-notification-agent" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.729118 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="ceilometer-notification-agent" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.729312 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="ceilometer-notification-agent" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.729332 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="sg-core" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.729346 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="proxy-httpd" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.729355 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" containerName="ceilometer-central-agent" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.731144 4867 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.734280 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.739533 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.744847 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.799445 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-log-httpd\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.799491 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqbql\" (UniqueName: \"kubernetes.io/projected/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-kube-api-access-gqbql\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.799611 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-config-data\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.799641 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-scripts\") pod \"ceilometer-0\" (UID: 
\"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.799863 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.800088 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.800789 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-run-httpd\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.902814 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-log-httpd\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.902855 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqbql\" (UniqueName: \"kubernetes.io/projected/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-kube-api-access-gqbql\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.902901 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-config-data\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.902931 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-scripts\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.903014 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.903045 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.903093 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-run-httpd\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.903564 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-run-httpd\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " 
pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.903558 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-log-httpd\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.912737 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.912771 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-scripts\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.912862 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.916982 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-config-data\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:54 crc kubenswrapper[4867]: I1006 13:22:54.936397 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqbql\" (UniqueName: 
\"kubernetes.io/projected/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-kube-api-access-gqbql\") pod \"ceilometer-0\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " pod="openstack/ceilometer-0" Oct 06 13:22:55 crc kubenswrapper[4867]: I1006 13:22:55.068767 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:22:55 crc kubenswrapper[4867]: I1006 13:22:55.251957 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fadf16c-2d2d-48bd-ab16-c2fac8c46b01" path="/var/lib/kubelet/pods/6fadf16c-2d2d-48bd-ab16-c2fac8c46b01/volumes" Oct 06 13:22:55 crc kubenswrapper[4867]: I1006 13:22:55.361771 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b666bc78f-zvlqd" event={"ID":"a06d3199-78ee-4389-bbd2-2bc53c012c84","Type":"ContainerStarted","Data":"89fb17cd5d4892f1eabfc4b117c77445b99318f67f6e0e92039056215961139d"} Oct 06 13:22:55 crc kubenswrapper[4867]: I1006 13:22:55.363578 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:55 crc kubenswrapper[4867]: I1006 13:22:55.363607 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:22:55 crc kubenswrapper[4867]: I1006 13:22:55.547694 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7b666bc78f-zvlqd" podStartSLOduration=3.547678103 podStartE2EDuration="3.547678103s" podCreationTimestamp="2025-10-06 13:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:22:55.388605413 +0000 UTC m=+1154.846553557" watchObservedRunningTime="2025-10-06 13:22:55.547678103 +0000 UTC m=+1155.005626247" Oct 06 13:22:55 crc kubenswrapper[4867]: I1006 13:22:55.558459 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 
13:22:55 crc kubenswrapper[4867]: I1006 13:22:55.651676 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 06 13:22:56 crc kubenswrapper[4867]: I1006 13:22:56.435715 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b","Type":"ContainerStarted","Data":"109cc83caec02f53f1494c29e5c689f8c9d56b7b58d38eb20781fd574a6e1dbb"} Oct 06 13:22:56 crc kubenswrapper[4867]: I1006 13:22:56.436403 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b","Type":"ContainerStarted","Data":"854acc0519462d93fa28d43f1a56a745d08970aa494e5a5da59bb1f9d62b9e12"} Oct 06 13:22:56 crc kubenswrapper[4867]: I1006 13:22:56.436436 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b","Type":"ContainerStarted","Data":"fbb39d98b51b35996a18779cf42f5737006cbb27c7e4d3c204968c0d9a6afed3"} Oct 06 13:22:57 crc kubenswrapper[4867]: I1006 13:22:57.890993 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:22:58 crc kubenswrapper[4867]: I1006 13:22:58.461502 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b","Type":"ContainerStarted","Data":"02163191074d8e9cbe01ea8d0eace1b67524b94ab39d5fcbcb9d959cf49249bb"} Oct 06 13:22:58 crc kubenswrapper[4867]: I1006 13:22:58.887519 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 06 13:22:58 crc kubenswrapper[4867]: I1006 13:22:58.976478 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 13:22:58 crc kubenswrapper[4867]: I1006 13:22:58.977727 4867 scope.go:117] "RemoveContainer" 
containerID="890eb876f39d7213ab994e4f7878a7aa46933c45bc36e78c02b7645a4eba0da0" Oct 06 13:22:58 crc kubenswrapper[4867]: E1006 13:22:58.978261 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c54babca-19bd-4a0b-a320-359b744ed066)\"" pod="openstack/watcher-decision-engine-0" podUID="c54babca-19bd-4a0b-a320-359b744ed066" Oct 06 13:22:59 crc kubenswrapper[4867]: E1006 13:22:59.594296 4867 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/bbabd46997abe7ed54655d0377cf712c980f18ee0f0a905b222915d80bde7ef6/diff" to get inode usage: stat /var/lib/containers/storage/overlay/bbabd46997abe7ed54655d0377cf712c980f18ee0f0a905b222915d80bde7ef6/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_ceilometer-0_6fadf16c-2d2d-48bd-ab16-c2fac8c46b01/ceilometer-central-agent/0.log" to get inode usage: stat /var/log/pods/openstack_ceilometer-0_6fadf16c-2d2d-48bd-ab16-c2fac8c46b01/ceilometer-central-agent/0.log: no such file or directory Oct 06 13:23:00 crc kubenswrapper[4867]: E1006 13:23:00.089915 4867 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/91e13c0f50635ff3d26f7ad5c04c0a2a9b833d418cd9f5a6aac5dda63ee7fd89/diff" to get inode usage: stat /var/lib/containers/storage/overlay/91e13c0f50635ff3d26f7ad5c04c0a2a9b833d418cd9f5a6aac5dda63ee7fd89/diff: no such file or directory, extraDiskErr: Oct 06 13:23:00 crc kubenswrapper[4867]: I1006 13:23:00.552655 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b","Type":"ContainerStarted","Data":"ffebc9c3a6736a8e2cf8e7d47affbb8fc319a22d240cf117407fa934d86b14bb"} Oct 06 13:23:00 crc 
kubenswrapper[4867]: I1006 13:23:00.553042 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="ceilometer-central-agent" containerID="cri-o://854acc0519462d93fa28d43f1a56a745d08970aa494e5a5da59bb1f9d62b9e12" gracePeriod=30 Oct 06 13:23:00 crc kubenswrapper[4867]: I1006 13:23:00.553541 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 13:23:00 crc kubenswrapper[4867]: I1006 13:23:00.553581 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="proxy-httpd" containerID="cri-o://ffebc9c3a6736a8e2cf8e7d47affbb8fc319a22d240cf117407fa934d86b14bb" gracePeriod=30 Oct 06 13:23:00 crc kubenswrapper[4867]: I1006 13:23:00.553688 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="sg-core" containerID="cri-o://02163191074d8e9cbe01ea8d0eace1b67524b94ab39d5fcbcb9d959cf49249bb" gracePeriod=30 Oct 06 13:23:00 crc kubenswrapper[4867]: I1006 13:23:00.553760 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="ceilometer-notification-agent" containerID="cri-o://109cc83caec02f53f1494c29e5c689f8c9d56b7b58d38eb20781fd574a6e1dbb" gracePeriod=30 Oct 06 13:23:00 crc kubenswrapper[4867]: I1006 13:23:00.612837 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.987629984 podStartE2EDuration="6.612804647s" podCreationTimestamp="2025-10-06 13:22:54 +0000 UTC" firstStartedPulling="2025-10-06 13:22:55.570438105 +0000 UTC m=+1155.028386249" lastFinishedPulling="2025-10-06 13:22:59.195612608 +0000 UTC m=+1158.653560912" 
observedRunningTime="2025-10-06 13:23:00.587760882 +0000 UTC m=+1160.045709036" watchObservedRunningTime="2025-10-06 13:23:00.612804647 +0000 UTC m=+1160.070752791" Oct 06 13:23:01 crc kubenswrapper[4867]: I1006 13:23:01.564445 4867 generic.go:334] "Generic (PLEG): container finished" podID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerID="ffebc9c3a6736a8e2cf8e7d47affbb8fc319a22d240cf117407fa934d86b14bb" exitCode=0 Oct 06 13:23:01 crc kubenswrapper[4867]: I1006 13:23:01.565045 4867 generic.go:334] "Generic (PLEG): container finished" podID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerID="02163191074d8e9cbe01ea8d0eace1b67524b94ab39d5fcbcb9d959cf49249bb" exitCode=2 Oct 06 13:23:01 crc kubenswrapper[4867]: I1006 13:23:01.565122 4867 generic.go:334] "Generic (PLEG): container finished" podID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerID="109cc83caec02f53f1494c29e5c689f8c9d56b7b58d38eb20781fd574a6e1dbb" exitCode=0 Oct 06 13:23:01 crc kubenswrapper[4867]: I1006 13:23:01.564573 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b","Type":"ContainerDied","Data":"ffebc9c3a6736a8e2cf8e7d47affbb8fc319a22d240cf117407fa934d86b14bb"} Oct 06 13:23:01 crc kubenswrapper[4867]: I1006 13:23:01.565316 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b","Type":"ContainerDied","Data":"02163191074d8e9cbe01ea8d0eace1b67524b94ab39d5fcbcb9d959cf49249bb"} Oct 06 13:23:01 crc kubenswrapper[4867]: I1006 13:23:01.565420 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b","Type":"ContainerDied","Data":"109cc83caec02f53f1494c29e5c689f8c9d56b7b58d38eb20781fd574a6e1dbb"} Oct 06 13:23:02 crc kubenswrapper[4867]: I1006 13:23:02.119073 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-577bfb968d-pw7pq" 
podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.164:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.164:8443: connect: connection refused" Oct 06 13:23:02 crc kubenswrapper[4867]: I1006 13:23:02.119237 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-577bfb968d-pw7pq" Oct 06 13:23:02 crc kubenswrapper[4867]: E1006 13:23:02.983698 4867 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/1ba67fabbaf2f7ec778fc51d4e99d1130b5c4836eaf6bc0f9460550a99f9229d/diff" to get inode usage: stat /var/lib/containers/storage/overlay/1ba67fabbaf2f7ec778fc51d4e99d1130b5c4836eaf6bc0f9460550a99f9229d/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_ceilometer-0_6fadf16c-2d2d-48bd-ab16-c2fac8c46b01/ceilometer-notification-agent/0.log" to get inode usage: stat /var/log/pods/openstack_ceilometer-0_6fadf16c-2d2d-48bd-ab16-c2fac8c46b01/ceilometer-notification-agent/0.log: no such file or directory Oct 06 13:23:03 crc kubenswrapper[4867]: I1006 13:23:03.274207 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:23:03 crc kubenswrapper[4867]: I1006 13:23:03.274976 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7b666bc78f-zvlqd" Oct 06 13:23:03 crc kubenswrapper[4867]: I1006 13:23:03.592518 4867 generic.go:334] "Generic (PLEG): container finished" podID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerID="854acc0519462d93fa28d43f1a56a745d08970aa494e5a5da59bb1f9d62b9e12" exitCode=0 Oct 06 13:23:03 crc kubenswrapper[4867]: I1006 13:23:03.592610 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b","Type":"ContainerDied","Data":"854acc0519462d93fa28d43f1a56a745d08970aa494e5a5da59bb1f9d62b9e12"} Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.004658 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.110850 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-scripts\") pod \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.110953 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-combined-ca-bundle\") pod \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.111017 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqbql\" (UniqueName: \"kubernetes.io/projected/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-kube-api-access-gqbql\") pod \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.111055 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-sg-core-conf-yaml\") pod \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.111086 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-run-httpd\") pod 
\"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.111105 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-config-data\") pod \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.111163 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-log-httpd\") pod \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\" (UID: \"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b\") " Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.111925 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" (UID: "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.112992 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" (UID: "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.119189 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-scripts" (OuterVolumeSpecName: "scripts") pod "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" (UID: "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.119663 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-kube-api-access-gqbql" (OuterVolumeSpecName: "kube-api-access-gqbql") pod "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" (UID: "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b"). InnerVolumeSpecName "kube-api-access-gqbql". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.146535 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" (UID: "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.213669 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.213709 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.213718 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.213730 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-scripts\") on node \"crc\" DevicePath \"\"" 
Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.213738 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqbql\" (UniqueName: \"kubernetes.io/projected/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-kube-api-access-gqbql\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.270453 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" (UID: "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.280384 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-config-data" (OuterVolumeSpecName: "config-data") pod "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" (UID: "1fc03d6c-43f0-43f3-a912-6902a9e6fa1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.316682 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.316711 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.625729 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1fc03d6c-43f0-43f3-a912-6902a9e6fa1b","Type":"ContainerDied","Data":"fbb39d98b51b35996a18779cf42f5737006cbb27c7e4d3c204968c0d9a6afed3"} Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.625752 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.625794 4867 scope.go:117] "RemoveContainer" containerID="ffebc9c3a6736a8e2cf8e7d47affbb8fc319a22d240cf117407fa934d86b14bb" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.628340 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b7620829-b468-470c-899e-92faea8bc3c7","Type":"ContainerStarted","Data":"3bffe59786e9156696962154708576e223f1e96292c94047c9b8105b8edce55c"} Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.708658 4867 scope.go:117] "RemoveContainer" containerID="02163191074d8e9cbe01ea8d0eace1b67524b94ab39d5fcbcb9d959cf49249bb" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.711865 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.728127192 podStartE2EDuration="15.711852843s" podCreationTimestamp="2025-10-06 13:22:51 +0000 
UTC" firstStartedPulling="2025-10-06 13:22:52.719718039 +0000 UTC m=+1152.177666183" lastFinishedPulling="2025-10-06 13:23:05.70344369 +0000 UTC m=+1165.161391834" observedRunningTime="2025-10-06 13:23:06.652542501 +0000 UTC m=+1166.110490645" watchObservedRunningTime="2025-10-06 13:23:06.711852843 +0000 UTC m=+1166.169800987" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.721004 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.727759 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.744412 4867 scope.go:117] "RemoveContainer" containerID="109cc83caec02f53f1494c29e5c689f8c9d56b7b58d38eb20781fd574a6e1dbb" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.756463 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:23:06 crc kubenswrapper[4867]: E1006 13:23:06.756996 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="ceilometer-central-agent" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.757016 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="ceilometer-central-agent" Oct 06 13:23:06 crc kubenswrapper[4867]: E1006 13:23:06.757032 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="sg-core" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.757038 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="sg-core" Oct 06 13:23:06 crc kubenswrapper[4867]: E1006 13:23:06.757051 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="proxy-httpd" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 
13:23:06.757058 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="proxy-httpd" Oct 06 13:23:06 crc kubenswrapper[4867]: E1006 13:23:06.757106 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="ceilometer-notification-agent" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.757113 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="ceilometer-notification-agent" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.757329 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="ceilometer-notification-agent" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.757344 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="proxy-httpd" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.757366 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="sg-core" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.757377 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" containerName="ceilometer-central-agent" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.759321 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.761930 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.763561 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.769042 4867 scope.go:117] "RemoveContainer" containerID="854acc0519462d93fa28d43f1a56a745d08970aa494e5a5da59bb1f9d62b9e12" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.773239 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.825879 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.825957 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-config-data\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.825993 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-run-httpd\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.826036 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-scripts\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.826077 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wvx2\" (UniqueName: \"kubernetes.io/projected/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-kube-api-access-6wvx2\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.826123 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-log-httpd\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.826146 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.927650 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.927720 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-config-data\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " 
pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.927749 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-run-httpd\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.927803 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-scripts\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.927843 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wvx2\" (UniqueName: \"kubernetes.io/projected/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-kube-api-access-6wvx2\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.927892 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-log-httpd\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.927919 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.928374 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-log-httpd\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.928376 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-run-httpd\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.931974 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.934637 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-scripts\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.934916 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-config-data\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.944467 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0" Oct 06 13:23:06 crc kubenswrapper[4867]: I1006 13:23:06.950608 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6wvx2\" (UniqueName: \"kubernetes.io/projected/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-kube-api-access-6wvx2\") pod \"ceilometer-0\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") " pod="openstack/ceilometer-0"
Oct 06 13:23:07 crc kubenswrapper[4867]: I1006 13:23:07.119199 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 13:23:07 crc kubenswrapper[4867]: I1006 13:23:07.242663 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc03d6c-43f0-43f3-a912-6902a9e6fa1b" path="/var/lib/kubelet/pods/1fc03d6c-43f0-43f3-a912-6902a9e6fa1b/volumes"
Oct 06 13:23:07 crc kubenswrapper[4867]: I1006 13:23:07.621994 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 13:23:07 crc kubenswrapper[4867]: I1006 13:23:07.672960 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1","Type":"ContainerStarted","Data":"11a46c734ae04429457cab2d1cc04581d7d46c0edf8cfe5a3586362c8929c3ed"}
Oct 06 13:23:08 crc kubenswrapper[4867]: I1006 13:23:08.696547 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1","Type":"ContainerStarted","Data":"b286bc56cce9c42c76d389b171334f137a39a9a6f100b495cdd8f08609db2b04"}
Oct 06 13:23:08 crc kubenswrapper[4867]: I1006 13:23:08.697629 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1","Type":"ContainerStarted","Data":"d00eaa345d512e0314d2771ecad331bc9aad80a6c8d115e3e2286255603fe18e"}
Oct 06 13:23:08 crc kubenswrapper[4867]: I1006 13:23:08.974065 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Oct 06 13:23:08 crc kubenswrapper[4867]: I1006 13:23:08.974665 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Oct 06 13:23:08 crc kubenswrapper[4867]: I1006 13:23:08.975244 4867 scope.go:117] "RemoveContainer" containerID="890eb876f39d7213ab994e4f7878a7aa46933c45bc36e78c02b7645a4eba0da0"
Oct 06 13:23:08 crc kubenswrapper[4867]: E1006 13:23:08.975687 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c54babca-19bd-4a0b-a320-359b744ed066)\"" pod="openstack/watcher-decision-engine-0" podUID="c54babca-19bd-4a0b-a320-359b744ed066"
Oct 06 13:23:09 crc kubenswrapper[4867]: I1006 13:23:09.708246 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1","Type":"ContainerStarted","Data":"5c7112879c97fc24a442813e4819ffed32104fc0e97031708eaa08d1dde93506"}
Oct 06 13:23:09 crc kubenswrapper[4867]: I1006 13:23:09.708962 4867 scope.go:117] "RemoveContainer" containerID="890eb876f39d7213ab994e4f7878a7aa46933c45bc36e78c02b7645a4eba0da0"
Oct 06 13:23:09 crc kubenswrapper[4867]: E1006 13:23:09.709364 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(c54babca-19bd-4a0b-a320-359b744ed066)\"" pod="openstack/watcher-decision-engine-0" podUID="c54babca-19bd-4a0b-a320-359b744ed066"
Oct 06 13:23:10 crc kubenswrapper[4867]: I1006 13:23:10.719399 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1","Type":"ContainerStarted","Data":"265795a3c8c0ed1e1204f5df4a426713a00dbd975c6b0446847937ccaa9b005d"}
Oct 06 13:23:10 crc kubenswrapper[4867]: I1006 13:23:10.719986 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 06 13:23:10 crc kubenswrapper[4867]: I1006 13:23:10.741685 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.06242541 podStartE2EDuration="4.741663348s" podCreationTimestamp="2025-10-06 13:23:06 +0000 UTC" firstStartedPulling="2025-10-06 13:23:07.628949748 +0000 UTC m=+1167.086897892" lastFinishedPulling="2025-10-06 13:23:10.308187676 +0000 UTC m=+1169.766135830" observedRunningTime="2025-10-06 13:23:10.739214751 +0000 UTC m=+1170.197162905" watchObservedRunningTime="2025-10-06 13:23:10.741663348 +0000 UTC m=+1170.199611492"
Oct 06 13:23:10 crc kubenswrapper[4867]: W1006 13:23:10.964386 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fadf16c_2d2d_48bd_ab16_c2fac8c46b01.slice/crio-76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c.scope WatchSource:0}: Error finding container 76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c: Status 404 returned error can't find the container with id 76c37ad069d064602b170b9f184861e2099a898fadcec5e2cb682220bb765f9c
Oct 06 13:23:10 crc kubenswrapper[4867]: W1006 13:23:10.965163 4867 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4dd974b_0132_491e_8254_26c144b9c7a9.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4dd974b_0132_491e_8254_26c144b9c7a9.slice: no such file or directory
Oct 06 13:23:10 crc kubenswrapper[4867]: W1006 13:23:10.974797 4867 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc03d6c_43f0_43f3_a912_6902a9e6fa1b.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc03d6c_43f0_43f3_a912_6902a9e6fa1b.slice: no such file or directory
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.467927 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-577bfb968d-pw7pq"
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.538025 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-config-data\") pod \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") "
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.538177 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvtnv\" (UniqueName: \"kubernetes.io/projected/77541c32-3bc1-402d-aa9f-924f9b6cb37f-kube-api-access-fvtnv\") pod \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") "
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.538297 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-secret-key\") pod \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") "
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.538360 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-scripts\") pod \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") "
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.538381 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-combined-ca-bundle\") pod \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") "
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.538437 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-tls-certs\") pod \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") "
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.538477 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77541c32-3bc1-402d-aa9f-924f9b6cb37f-logs\") pod \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\" (UID: \"77541c32-3bc1-402d-aa9f-924f9b6cb37f\") "
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.539120 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77541c32-3bc1-402d-aa9f-924f9b6cb37f-logs" (OuterVolumeSpecName: "logs") pod "77541c32-3bc1-402d-aa9f-924f9b6cb37f" (UID: "77541c32-3bc1-402d-aa9f-924f9b6cb37f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.547622 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "77541c32-3bc1-402d-aa9f-924f9b6cb37f" (UID: "77541c32-3bc1-402d-aa9f-924f9b6cb37f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.550749 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77541c32-3bc1-402d-aa9f-924f9b6cb37f-kube-api-access-fvtnv" (OuterVolumeSpecName: "kube-api-access-fvtnv") pod "77541c32-3bc1-402d-aa9f-924f9b6cb37f" (UID: "77541c32-3bc1-402d-aa9f-924f9b6cb37f"). InnerVolumeSpecName "kube-api-access-fvtnv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.570663 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-config-data" (OuterVolumeSpecName: "config-data") pod "77541c32-3bc1-402d-aa9f-924f9b6cb37f" (UID: "77541c32-3bc1-402d-aa9f-924f9b6cb37f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.578172 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77541c32-3bc1-402d-aa9f-924f9b6cb37f" (UID: "77541c32-3bc1-402d-aa9f-924f9b6cb37f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.593464 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-scripts" (OuterVolumeSpecName: "scripts") pod "77541c32-3bc1-402d-aa9f-924f9b6cb37f" (UID: "77541c32-3bc1-402d-aa9f-924f9b6cb37f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.614497 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "77541c32-3bc1-402d-aa9f-924f9b6cb37f" (UID: "77541c32-3bc1-402d-aa9f-924f9b6cb37f"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.640506 4867 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.640543 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77541c32-3bc1-402d-aa9f-924f9b6cb37f-logs\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.640554 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.640565 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvtnv\" (UniqueName: \"kubernetes.io/projected/77541c32-3bc1-402d-aa9f-924f9b6cb37f-kube-api-access-fvtnv\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.640576 4867 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.640584 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77541c32-3bc1-402d-aa9f-924f9b6cb37f-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.640593 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77541c32-3bc1-402d-aa9f-924f9b6cb37f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.730317 4867 generic.go:334] "Generic (PLEG): container finished" podID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerID="5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686" exitCode=137
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.731460 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-577bfb968d-pw7pq"
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.736440 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-577bfb968d-pw7pq" event={"ID":"77541c32-3bc1-402d-aa9f-924f9b6cb37f","Type":"ContainerDied","Data":"5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686"}
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.736530 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-577bfb968d-pw7pq" event={"ID":"77541c32-3bc1-402d-aa9f-924f9b6cb37f","Type":"ContainerDied","Data":"cbc9a756b5a13c59f64bb398bcaa677f88067b457d4449eb3e52985843e7d53d"}
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.736551 4867 scope.go:117] "RemoveContainer" containerID="8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1"
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.776015 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-577bfb968d-pw7pq"]
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.787618 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-577bfb968d-pw7pq"]
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.924028 4867 scope.go:117] "RemoveContainer" containerID="5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686"
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.948694 4867 scope.go:117] "RemoveContainer" containerID="8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1"
Oct 06 13:23:11 crc kubenswrapper[4867]: E1006 13:23:11.949223 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1\": container with ID starting with 8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1 not found: ID does not exist" containerID="8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1"
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.949268 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1"} err="failed to get container status \"8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1\": rpc error: code = NotFound desc = could not find container \"8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1\": container with ID starting with 8fe961a303c8c5c9ae9a747ad9fdc3e8cb64fefa9af8883e831275e26699c6e1 not found: ID does not exist"
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.949290 4867 scope.go:117] "RemoveContainer" containerID="5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686"
Oct 06 13:23:11 crc kubenswrapper[4867]: E1006 13:23:11.949652 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686\": container with ID starting with 5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686 not found: ID does not exist" containerID="5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686"
Oct 06 13:23:11 crc kubenswrapper[4867]: I1006 13:23:11.949679 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686"} err="failed to get container status \"5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686\": rpc error: code = NotFound desc = could not find container \"5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686\": container with ID starting with 5730aa1cc45c4bf8647b15c0406023a7d7a198e5c1a015a4e43025386176f686 not found: ID does not exist"
Oct 06 13:23:12 crc kubenswrapper[4867]: I1006 13:23:12.955004 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 13:23:12 crc kubenswrapper[4867]: I1006 13:23:12.955332 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="ceilometer-central-agent" containerID="cri-o://d00eaa345d512e0314d2771ecad331bc9aad80a6c8d115e3e2286255603fe18e" gracePeriod=30
Oct 06 13:23:12 crc kubenswrapper[4867]: I1006 13:23:12.955760 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="proxy-httpd" containerID="cri-o://265795a3c8c0ed1e1204f5df4a426713a00dbd975c6b0446847937ccaa9b005d" gracePeriod=30
Oct 06 13:23:12 crc kubenswrapper[4867]: I1006 13:23:12.955801 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="ceilometer-notification-agent" containerID="cri-o://b286bc56cce9c42c76d389b171334f137a39a9a6f100b495cdd8f08609db2b04" gracePeriod=30
Oct 06 13:23:12 crc kubenswrapper[4867]: I1006 13:23:12.956000 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="sg-core" containerID="cri-o://5c7112879c97fc24a442813e4819ffed32104fc0e97031708eaa08d1dde93506" gracePeriod=30
Oct 06 13:23:13 crc kubenswrapper[4867]: I1006 13:23:13.232895 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" path="/var/lib/kubelet/pods/77541c32-3bc1-402d-aa9f-924f9b6cb37f/volumes"
Oct 06 13:23:13 crc kubenswrapper[4867]: I1006 13:23:13.755477 4867 generic.go:334] "Generic (PLEG): container finished" podID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerID="265795a3c8c0ed1e1204f5df4a426713a00dbd975c6b0446847937ccaa9b005d" exitCode=0
Oct 06 13:23:13 crc kubenswrapper[4867]: I1006 13:23:13.755530 4867 generic.go:334] "Generic (PLEG): container finished" podID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerID="5c7112879c97fc24a442813e4819ffed32104fc0e97031708eaa08d1dde93506" exitCode=2
Oct 06 13:23:13 crc kubenswrapper[4867]: I1006 13:23:13.755540 4867 generic.go:334] "Generic (PLEG): container finished" podID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerID="b286bc56cce9c42c76d389b171334f137a39a9a6f100b495cdd8f08609db2b04" exitCode=0
Oct 06 13:23:13 crc kubenswrapper[4867]: I1006 13:23:13.755563 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1","Type":"ContainerDied","Data":"265795a3c8c0ed1e1204f5df4a426713a00dbd975c6b0446847937ccaa9b005d"}
Oct 06 13:23:13 crc kubenswrapper[4867]: I1006 13:23:13.755645 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1","Type":"ContainerDied","Data":"5c7112879c97fc24a442813e4819ffed32104fc0e97031708eaa08d1dde93506"}
Oct 06 13:23:13 crc kubenswrapper[4867]: I1006 13:23:13.755663 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1","Type":"ContainerDied","Data":"b286bc56cce9c42c76d389b171334f137a39a9a6f100b495cdd8f08609db2b04"}
Oct 06 13:23:14 crc kubenswrapper[4867]: I1006 13:23:14.772616 4867 generic.go:334] "Generic (PLEG): container finished" podID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerID="d00eaa345d512e0314d2771ecad331bc9aad80a6c8d115e3e2286255603fe18e" exitCode=0
Oct 06 13:23:14 crc kubenswrapper[4867]: I1006 13:23:14.772741 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1","Type":"ContainerDied","Data":"d00eaa345d512e0314d2771ecad331bc9aad80a6c8d115e3e2286255603fe18e"}
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.007387 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.112245 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-run-httpd\") pod \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") "
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.112370 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-combined-ca-bundle\") pod \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") "
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.112870 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" (UID: "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.113132 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-log-httpd\") pod \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") "
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.113297 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-scripts\") pod \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") "
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.113363 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wvx2\" (UniqueName: \"kubernetes.io/projected/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-kube-api-access-6wvx2\") pod \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") "
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.113440 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-config-data\") pod \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") "
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.113470 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-sg-core-conf-yaml\") pod \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\" (UID: \"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1\") "
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.113489 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" (UID: "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.113998 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.114329 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.119885 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-scripts" (OuterVolumeSpecName: "scripts") pod "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" (UID: "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.127460 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-kube-api-access-6wvx2" (OuterVolumeSpecName: "kube-api-access-6wvx2") pod "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" (UID: "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1"). InnerVolumeSpecName "kube-api-access-6wvx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.144810 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" (UID: "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.213820 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" (UID: "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.217178 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.217425 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.217510 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wvx2\" (UniqueName: \"kubernetes.io/projected/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-kube-api-access-6wvx2\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.217598 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.249960 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-config-data" (OuterVolumeSpecName: "config-data") pod "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" (UID: "55edb26b-d97b-4533-92a7-0ecf1b5ef2d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.319054 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.787812 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55edb26b-d97b-4533-92a7-0ecf1b5ef2d1","Type":"ContainerDied","Data":"11a46c734ae04429457cab2d1cc04581d7d46c0edf8cfe5a3586362c8929c3ed"}
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.787884 4867 scope.go:117] "RemoveContainer" containerID="265795a3c8c0ed1e1204f5df4a426713a00dbd975c6b0446847937ccaa9b005d"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.787890 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.836339 4867 scope.go:117] "RemoveContainer" containerID="5c7112879c97fc24a442813e4819ffed32104fc0e97031708eaa08d1dde93506"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.876947 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.909237 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mvq2j"]
Oct 06 13:23:15 crc kubenswrapper[4867]: E1006 13:23:15.909623 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="proxy-httpd"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.909634 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="proxy-httpd"
Oct 06 13:23:15 crc kubenswrapper[4867]: E1006 13:23:15.909655 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="ceilometer-central-agent"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.909662 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="ceilometer-central-agent"
Oct 06 13:23:15 crc kubenswrapper[4867]: E1006 13:23:15.909669 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon-log"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.909675 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon-log"
Oct 06 13:23:15 crc kubenswrapper[4867]: E1006 13:23:15.909686 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="sg-core"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.909692 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="sg-core"
Oct 06 13:23:15 crc kubenswrapper[4867]: E1006 13:23:15.909705 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.909721 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon"
Oct 06 13:23:15 crc kubenswrapper[4867]: E1006 13:23:15.909751 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="ceilometer-notification-agent"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.909757 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="ceilometer-notification-agent"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.909922 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="proxy-httpd"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.909967 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="ceilometer-central-agent"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.909976 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="sg-core"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.909987 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon-log"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.910003 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="77541c32-3bc1-402d-aa9f-924f9b6cb37f" containerName="horizon"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.910018 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" containerName="ceilometer-notification-agent"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.910649 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mvq2j"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.921150 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.932999 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjbj5\" (UniqueName: \"kubernetes.io/projected/3f1a49ca-4178-4e3b-958e-a766b5087e59-kube-api-access-bjbj5\") pod \"nova-api-db-create-mvq2j\" (UID: \"3f1a49ca-4178-4e3b-958e-a766b5087e59\") " pod="openstack/nova-api-db-create-mvq2j"
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.972521 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mvq2j"]
Oct 06 13:23:15 crc kubenswrapper[4867]: I1006 13:23:15.980579 4867 scope.go:117] "RemoveContainer" containerID="b286bc56cce9c42c76d389b171334f137a39a9a6f100b495cdd8f08609db2b04"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:15.999327 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.007281 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.014044 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.014663 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.038779 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-config-data\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.039155 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjbj5\" (UniqueName: \"kubernetes.io/projected/3f1a49ca-4178-4e3b-958e-a766b5087e59-kube-api-access-bjbj5\") pod \"nova-api-db-create-mvq2j\" (UID: \"3f1a49ca-4178-4e3b-958e-a766b5087e59\") " pod="openstack/nova-api-db-create-mvq2j"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.039180 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-scripts\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.039199 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-log-httpd\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.039216 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-run-httpd\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.039234 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.039282 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj9wm\" (UniqueName: \"kubernetes.io/projected/4d81f192-aeec-4482-abbd-35e95040a432-kube-api-access-lj9wm\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.039306 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.040981 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.095610 4867 scope.go:117] "RemoveContainer" containerID="d00eaa345d512e0314d2771ecad331bc9aad80a6c8d115e3e2286255603fe18e"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.117055 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjbj5\" (UniqueName: \"kubernetes.io/projected/3f1a49ca-4178-4e3b-958e-a766b5087e59-kube-api-access-bjbj5\") pod \"nova-api-db-create-mvq2j\" (UID: \"3f1a49ca-4178-4e3b-958e-a766b5087e59\") " pod="openstack/nova-api-db-create-mvq2j"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.161291 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-xpp6d"]
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.162760 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xpp6d"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.173517 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-scripts\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.173566 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-log-httpd\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.173590 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-run-httpd\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.173607 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0"
Oct 06 13:23:16 crc kubenswrapper[4867]: I1006
13:23:16.173648 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj9wm\" (UniqueName: \"kubernetes.io/projected/4d81f192-aeec-4482-abbd-35e95040a432-kube-api-access-lj9wm\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.173679 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.173762 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-config-data\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.174821 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-run-httpd\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.177662 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-log-httpd\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.214334 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xpp6d"] Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.219639 4867 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.220371 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-config-data\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.220818 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-scripts\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.221230 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.225874 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj9wm\" (UniqueName: \"kubernetes.io/projected/4d81f192-aeec-4482-abbd-35e95040a432-kube-api-access-lj9wm\") pod \"ceilometer-0\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " pod="openstack/ceilometer-0" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.276587 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsbsq\" (UniqueName: \"kubernetes.io/projected/7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8-kube-api-access-qsbsq\") pod \"nova-cell0-db-create-xpp6d\" (UID: \"7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8\") " 
pod="openstack/nova-cell0-db-create-xpp6d" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.283511 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jxwfj"] Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.284855 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jxwfj" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.293791 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mvq2j" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.307931 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jxwfj"] Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.396226 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsbsq\" (UniqueName: \"kubernetes.io/projected/7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8-kube-api-access-qsbsq\") pod \"nova-cell0-db-create-xpp6d\" (UID: \"7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8\") " pod="openstack/nova-cell0-db-create-xpp6d" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.396450 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xthj7\" (UniqueName: \"kubernetes.io/projected/cb523789-87d3-4cd0-8047-a431894a91ff-kube-api-access-xthj7\") pod \"nova-cell1-db-create-jxwfj\" (UID: \"cb523789-87d3-4cd0-8047-a431894a91ff\") " pod="openstack/nova-cell1-db-create-jxwfj" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.399820 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.400073 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f72b9a47-8845-4870-9e50-d15fa17e2db4" containerName="glance-log" 
containerID="cri-o://da6dcfbcb5bf5b7cc561c00c394063c697c4ea8db4cb00b224323ef2bbe369c9" gracePeriod=30 Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.400539 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f72b9a47-8845-4870-9e50-d15fa17e2db4" containerName="glance-httpd" containerID="cri-o://95703887daf9ae67a46b2677bc3b3b2f68fc6675b5f947312336d5039a06c32b" gracePeriod=30 Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.429165 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsbsq\" (UniqueName: \"kubernetes.io/projected/7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8-kube-api-access-qsbsq\") pod \"nova-cell0-db-create-xpp6d\" (UID: \"7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8\") " pod="openstack/nova-cell0-db-create-xpp6d" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.453149 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.499649 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xthj7\" (UniqueName: \"kubernetes.io/projected/cb523789-87d3-4cd0-8047-a431894a91ff-kube-api-access-xthj7\") pod \"nova-cell1-db-create-jxwfj\" (UID: \"cb523789-87d3-4cd0-8047-a431894a91ff\") " pod="openstack/nova-cell1-db-create-jxwfj" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.543559 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xthj7\" (UniqueName: \"kubernetes.io/projected/cb523789-87d3-4cd0-8047-a431894a91ff-kube-api-access-xthj7\") pod \"nova-cell1-db-create-jxwfj\" (UID: \"cb523789-87d3-4cd0-8047-a431894a91ff\") " pod="openstack/nova-cell1-db-create-jxwfj" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.609693 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xpp6d" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.621612 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jxwfj" Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.874238 4867 generic.go:334] "Generic (PLEG): container finished" podID="f72b9a47-8845-4870-9e50-d15fa17e2db4" containerID="da6dcfbcb5bf5b7cc561c00c394063c697c4ea8db4cb00b224323ef2bbe369c9" exitCode=143 Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.874313 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f72b9a47-8845-4870-9e50-d15fa17e2db4","Type":"ContainerDied","Data":"da6dcfbcb5bf5b7cc561c00c394063c697c4ea8db4cb00b224323ef2bbe369c9"} Oct 06 13:23:16 crc kubenswrapper[4867]: I1006 13:23:16.941427 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mvq2j"] Oct 06 13:23:17 crc kubenswrapper[4867]: I1006 13:23:17.160754 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:23:17 crc kubenswrapper[4867]: W1006 13:23:17.164304 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d81f192_aeec_4482_abbd_35e95040a432.slice/crio-66241a60969f3769146e136cef7b5b92d3e568ad2ed4f74ccf2ca9a5af328f8d WatchSource:0}: Error finding container 66241a60969f3769146e136cef7b5b92d3e568ad2ed4f74ccf2ca9a5af328f8d: Status 404 returned error can't find the container with id 66241a60969f3769146e136cef7b5b92d3e568ad2ed4f74ccf2ca9a5af328f8d Oct 06 13:23:17 crc kubenswrapper[4867]: I1006 13:23:17.249318 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55edb26b-d97b-4533-92a7-0ecf1b5ef2d1" path="/var/lib/kubelet/pods/55edb26b-d97b-4533-92a7-0ecf1b5ef2d1/volumes" Oct 06 13:23:17 crc kubenswrapper[4867]: I1006 13:23:17.427127 4867 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jxwfj"] Oct 06 13:23:17 crc kubenswrapper[4867]: W1006 13:23:17.517620 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7efcdadc_ad8e_4c32_8c7d_7c3e387d2cc8.slice/crio-f77906a1e959d9ed99b6a82c9cbc896df3e9bbcdae678543cca64e79bdfe9dc2 WatchSource:0}: Error finding container f77906a1e959d9ed99b6a82c9cbc896df3e9bbcdae678543cca64e79bdfe9dc2: Status 404 returned error can't find the container with id f77906a1e959d9ed99b6a82c9cbc896df3e9bbcdae678543cca64e79bdfe9dc2 Oct 06 13:23:17 crc kubenswrapper[4867]: I1006 13:23:17.522364 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xpp6d"] Oct 06 13:23:17 crc kubenswrapper[4867]: I1006 13:23:17.888163 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d81f192-aeec-4482-abbd-35e95040a432","Type":"ContainerStarted","Data":"66241a60969f3769146e136cef7b5b92d3e568ad2ed4f74ccf2ca9a5af328f8d"} Oct 06 13:23:17 crc kubenswrapper[4867]: I1006 13:23:17.889926 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xpp6d" event={"ID":"7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8","Type":"ContainerStarted","Data":"f77906a1e959d9ed99b6a82c9cbc896df3e9bbcdae678543cca64e79bdfe9dc2"} Oct 06 13:23:17 crc kubenswrapper[4867]: I1006 13:23:17.891602 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mvq2j" event={"ID":"3f1a49ca-4178-4e3b-958e-a766b5087e59","Type":"ContainerStarted","Data":"90cbb1aaa578cf9e967c8cb2266ed24edea10f2e46f0ff6e7345765e44639e00"} Oct 06 13:23:17 crc kubenswrapper[4867]: I1006 13:23:17.892806 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jxwfj" 
event={"ID":"cb523789-87d3-4cd0-8047-a431894a91ff","Type":"ContainerStarted","Data":"90b15f66fb13a91ebcbc6b3143778b083bc75bccb0cd63d8510671965a70e4e4"} Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.490504 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.490756 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" containerName="glance-log" containerID="cri-o://5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90" gracePeriod=30 Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.490857 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" containerName="glance-httpd" containerID="cri-o://7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c" gracePeriod=30 Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.911614 4867 generic.go:334] "Generic (PLEG): container finished" podID="f72b9a47-8845-4870-9e50-d15fa17e2db4" containerID="95703887daf9ae67a46b2677bc3b3b2f68fc6675b5f947312336d5039a06c32b" exitCode=0 Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.911737 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f72b9a47-8845-4870-9e50-d15fa17e2db4","Type":"ContainerDied","Data":"95703887daf9ae67a46b2677bc3b3b2f68fc6675b5f947312336d5039a06c32b"} Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.917369 4867 generic.go:334] "Generic (PLEG): container finished" podID="7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8" containerID="f04328fce3f0e8db1446137312663ce22cfd5858142bb839f1b6de6c54f1c9cf" exitCode=0 Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.917478 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-xpp6d" event={"ID":"7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8","Type":"ContainerDied","Data":"f04328fce3f0e8db1446137312663ce22cfd5858142bb839f1b6de6c54f1c9cf"} Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.919236 4867 generic.go:334] "Generic (PLEG): container finished" podID="3f1a49ca-4178-4e3b-958e-a766b5087e59" containerID="d188d39a7decbb3c975c7d397a2dcf4525b0f90cf4b59da34e5014a956f8f055" exitCode=0 Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.919296 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mvq2j" event={"ID":"3f1a49ca-4178-4e3b-958e-a766b5087e59","Type":"ContainerDied","Data":"d188d39a7decbb3c975c7d397a2dcf4525b0f90cf4b59da34e5014a956f8f055"} Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.931617 4867 generic.go:334] "Generic (PLEG): container finished" podID="cb523789-87d3-4cd0-8047-a431894a91ff" containerID="b477480223cd8714c5f5d8ea9f8597383152d9544938c31b2d4e72bb88acb43b" exitCode=0 Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.931789 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jxwfj" event={"ID":"cb523789-87d3-4cd0-8047-a431894a91ff","Type":"ContainerDied","Data":"b477480223cd8714c5f5d8ea9f8597383152d9544938c31b2d4e72bb88acb43b"} Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.942936 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" containerID="5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90" exitCode=143 Oct 06 13:23:18 crc kubenswrapper[4867]: I1006 13:23:18.942981 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae","Type":"ContainerDied","Data":"5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90"} Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.330005 4867 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.414148 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-logs\") pod \"f72b9a47-8845-4870-9e50-d15fa17e2db4\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.414301 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-config-data\") pod \"f72b9a47-8845-4870-9e50-d15fa17e2db4\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.414403 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-scripts\") pod \"f72b9a47-8845-4870-9e50-d15fa17e2db4\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.414477 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-httpd-run\") pod \"f72b9a47-8845-4870-9e50-d15fa17e2db4\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.415080 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-logs" (OuterVolumeSpecName: "logs") pod "f72b9a47-8845-4870-9e50-d15fa17e2db4" (UID: "f72b9a47-8845-4870-9e50-d15fa17e2db4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.415272 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tlnv\" (UniqueName: \"kubernetes.io/projected/f72b9a47-8845-4870-9e50-d15fa17e2db4-kube-api-access-2tlnv\") pod \"f72b9a47-8845-4870-9e50-d15fa17e2db4\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.415305 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-combined-ca-bundle\") pod \"f72b9a47-8845-4870-9e50-d15fa17e2db4\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.415372 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"f72b9a47-8845-4870-9e50-d15fa17e2db4\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.415398 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-public-tls-certs\") pod \"f72b9a47-8845-4870-9e50-d15fa17e2db4\" (UID: \"f72b9a47-8845-4870-9e50-d15fa17e2db4\") " Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.415534 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f72b9a47-8845-4870-9e50-d15fa17e2db4" (UID: "f72b9a47-8845-4870-9e50-d15fa17e2db4"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.416124 4867 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.416141 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f72b9a47-8845-4870-9e50-d15fa17e2db4-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.420207 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-scripts" (OuterVolumeSpecName: "scripts") pod "f72b9a47-8845-4870-9e50-d15fa17e2db4" (UID: "f72b9a47-8845-4870-9e50-d15fa17e2db4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.424471 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f72b9a47-8845-4870-9e50-d15fa17e2db4-kube-api-access-2tlnv" (OuterVolumeSpecName: "kube-api-access-2tlnv") pod "f72b9a47-8845-4870-9e50-d15fa17e2db4" (UID: "f72b9a47-8845-4870-9e50-d15fa17e2db4"). InnerVolumeSpecName "kube-api-access-2tlnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.428453 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "f72b9a47-8845-4870-9e50-d15fa17e2db4" (UID: "f72b9a47-8845-4870-9e50-d15fa17e2db4"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.465636 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f72b9a47-8845-4870-9e50-d15fa17e2db4" (UID: "f72b9a47-8845-4870-9e50-d15fa17e2db4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.481416 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-config-data" (OuterVolumeSpecName: "config-data") pod "f72b9a47-8845-4870-9e50-d15fa17e2db4" (UID: "f72b9a47-8845-4870-9e50-d15fa17e2db4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.488593 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f72b9a47-8845-4870-9e50-d15fa17e2db4" (UID: "f72b9a47-8845-4870-9e50-d15fa17e2db4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.518860 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.518905 4867 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.518948 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.518961 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.518973 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tlnv\" (UniqueName: \"kubernetes.io/projected/f72b9a47-8845-4870-9e50-d15fa17e2db4-kube-api-access-2tlnv\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.518986 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f72b9a47-8845-4870-9e50-d15fa17e2db4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.540219 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.619682 4867 reconciler_common.go:293] "Volume detached for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.957606 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d81f192-aeec-4482-abbd-35e95040a432","Type":"ContainerStarted","Data":"534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087"} Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.957687 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d81f192-aeec-4482-abbd-35e95040a432","Type":"ContainerStarted","Data":"73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36"} Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.960443 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f72b9a47-8845-4870-9e50-d15fa17e2db4","Type":"ContainerDied","Data":"2b78342651932d1ae0b7393d057d9ea393c0e470c05b145d014c30963bbd6a93"} Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.960521 4867 scope.go:117] "RemoveContainer" containerID="95703887daf9ae67a46b2677bc3b3b2f68fc6675b5f947312336d5039a06c32b" Oct 06 13:23:19 crc kubenswrapper[4867]: I1006 13:23:19.960741 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.004965 4867 scope.go:117] "RemoveContainer" containerID="da6dcfbcb5bf5b7cc561c00c394063c697c4ea8db4cb00b224323ef2bbe369c9" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.022346 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.029697 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.039189 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:23:20 crc kubenswrapper[4867]: E1006 13:23:20.039596 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f72b9a47-8845-4870-9e50-d15fa17e2db4" containerName="glance-log" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.039611 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f72b9a47-8845-4870-9e50-d15fa17e2db4" containerName="glance-log" Oct 06 13:23:20 crc kubenswrapper[4867]: E1006 13:23:20.039644 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f72b9a47-8845-4870-9e50-d15fa17e2db4" containerName="glance-httpd" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.039650 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f72b9a47-8845-4870-9e50-d15fa17e2db4" containerName="glance-httpd" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.039833 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f72b9a47-8845-4870-9e50-d15fa17e2db4" containerName="glance-httpd" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.039865 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f72b9a47-8845-4870-9e50-d15fa17e2db4" containerName="glance-log" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.041274 4867 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.045224 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.045472 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.066964 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.235305 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.235392 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.235453 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbbe30dd-179b-4e2b-b011-b395c30e32a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.235486 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.235509 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.235558 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v676\" (UniqueName: \"kubernetes.io/projected/fbbe30dd-179b-4e2b-b011-b395c30e32a9-kube-api-access-4v676\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.235585 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbe30dd-179b-4e2b-b011-b395c30e32a9-logs\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.235600 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.337376 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4v676\" (UniqueName: \"kubernetes.io/projected/fbbe30dd-179b-4e2b-b011-b395c30e32a9-kube-api-access-4v676\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.337765 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbe30dd-179b-4e2b-b011-b395c30e32a9-logs\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.337787 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.337868 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.337923 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.337973 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fbbe30dd-179b-4e2b-b011-b395c30e32a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.338001 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.338017 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.339403 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbbe30dd-179b-4e2b-b011-b395c30e32a9-logs\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.340193 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbbe30dd-179b-4e2b-b011-b395c30e32a9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.340731 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: 
\"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.355765 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v676\" (UniqueName: \"kubernetes.io/projected/fbbe30dd-179b-4e2b-b011-b395c30e32a9-kube-api-access-4v676\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.357596 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.358942 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.359121 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.361083 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbbe30dd-179b-4e2b-b011-b395c30e32a9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " 
pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.451835 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"fbbe30dd-179b-4e2b-b011-b395c30e32a9\") " pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.509369 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jxwfj" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.655113 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xthj7\" (UniqueName: \"kubernetes.io/projected/cb523789-87d3-4cd0-8047-a431894a91ff-kube-api-access-xthj7\") pod \"cb523789-87d3-4cd0-8047-a431894a91ff\" (UID: \"cb523789-87d3-4cd0-8047-a431894a91ff\") " Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.670400 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb523789-87d3-4cd0-8047-a431894a91ff-kube-api-access-xthj7" (OuterVolumeSpecName: "kube-api-access-xthj7") pod "cb523789-87d3-4cd0-8047-a431894a91ff" (UID: "cb523789-87d3-4cd0-8047-a431894a91ff"). InnerVolumeSpecName "kube-api-access-xthj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.675751 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.770013 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xthj7\" (UniqueName: \"kubernetes.io/projected/cb523789-87d3-4cd0-8047-a431894a91ff-kube-api-access-xthj7\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.859691 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xpp6d" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.862113 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mvq2j" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.881635 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjbj5\" (UniqueName: \"kubernetes.io/projected/3f1a49ca-4178-4e3b-958e-a766b5087e59-kube-api-access-bjbj5\") pod \"3f1a49ca-4178-4e3b-958e-a766b5087e59\" (UID: \"3f1a49ca-4178-4e3b-958e-a766b5087e59\") " Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.881730 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsbsq\" (UniqueName: \"kubernetes.io/projected/7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8-kube-api-access-qsbsq\") pod \"7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8\" (UID: \"7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8\") " Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.890454 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8-kube-api-access-qsbsq" (OuterVolumeSpecName: "kube-api-access-qsbsq") pod "7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8" (UID: "7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8"). InnerVolumeSpecName "kube-api-access-qsbsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.891474 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f1a49ca-4178-4e3b-958e-a766b5087e59-kube-api-access-bjbj5" (OuterVolumeSpecName: "kube-api-access-bjbj5") pod "3f1a49ca-4178-4e3b-958e-a766b5087e59" (UID: "3f1a49ca-4178-4e3b-958e-a766b5087e59"). InnerVolumeSpecName "kube-api-access-bjbj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.981524 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xpp6d" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.981733 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xpp6d" event={"ID":"7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8","Type":"ContainerDied","Data":"f77906a1e959d9ed99b6a82c9cbc896df3e9bbcdae678543cca64e79bdfe9dc2"} Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.981776 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f77906a1e959d9ed99b6a82c9cbc896df3e9bbcdae678543cca64e79bdfe9dc2" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.984408 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsbsq\" (UniqueName: \"kubernetes.io/projected/7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8-kube-api-access-qsbsq\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.984446 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjbj5\" (UniqueName: \"kubernetes.io/projected/3f1a49ca-4178-4e3b-958e-a766b5087e59-kube-api-access-bjbj5\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.994962 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mvq2j" 
event={"ID":"3f1a49ca-4178-4e3b-958e-a766b5087e59","Type":"ContainerDied","Data":"90cbb1aaa578cf9e967c8cb2266ed24edea10f2e46f0ff6e7345765e44639e00"} Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.995015 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90cbb1aaa578cf9e967c8cb2266ed24edea10f2e46f0ff6e7345765e44639e00" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.995092 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mvq2j" Oct 06 13:23:20 crc kubenswrapper[4867]: I1006 13:23:20.996960 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.018137 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jxwfj" event={"ID":"cb523789-87d3-4cd0-8047-a431894a91ff","Type":"ContainerDied","Data":"90b15f66fb13a91ebcbc6b3143778b083bc75bccb0cd63d8510671965a70e4e4"} Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.018546 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90b15f66fb13a91ebcbc6b3143778b083bc75bccb0cd63d8510671965a70e4e4" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.018618 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jxwfj" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.047309 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d81f192-aeec-4482-abbd-35e95040a432","Type":"ContainerStarted","Data":"c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e"} Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.071159 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" containerID="7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c" exitCode=0 Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.071220 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae","Type":"ContainerDied","Data":"7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c"} Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.071267 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae","Type":"ContainerDied","Data":"fe510d92f245fcfa5be83f61e662a0fd41864204a507f146e1990cea1622ea37"} Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.071287 4867 scope.go:117] "RemoveContainer" containerID="7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.071446 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.085543 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-combined-ca-bundle\") pod \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.085587 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-internal-tls-certs\") pod \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.085753 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-logs\") pod \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.085875 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-httpd-run\") pod \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.086049 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-config-data\") pod \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.086123 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsqlv\" (UniqueName: 
\"kubernetes.io/projected/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-kube-api-access-wsqlv\") pod \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.086152 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.086172 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-scripts\") pod \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\" (UID: \"8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae\") " Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.088184 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-logs" (OuterVolumeSpecName: "logs") pod "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" (UID: "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.088492 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" (UID: "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.092828 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-kube-api-access-wsqlv" (OuterVolumeSpecName: "kube-api-access-wsqlv") pod "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" (UID: "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae"). InnerVolumeSpecName "kube-api-access-wsqlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.095423 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" (UID: "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.100418 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-scripts" (OuterVolumeSpecName: "scripts") pod "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" (UID: "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.114821 4867 scope.go:117] "RemoveContainer" containerID="5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.127622 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" (UID: "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.150496 4867 scope.go:117] "RemoveContainer" containerID="7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c" Oct 06 13:23:21 crc kubenswrapper[4867]: E1006 13:23:21.155091 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c\": container with ID starting with 7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c not found: ID does not exist" containerID="7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.155140 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c"} err="failed to get container status \"7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c\": rpc error: code = NotFound desc = could not find container \"7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c\": container with ID starting with 7cf8d542f1c6872b9f598e1ee3f2987f47546194cc0de293df9beb7b4fa9711c not found: ID does not exist" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.155177 4867 scope.go:117] "RemoveContainer" containerID="5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90" Oct 06 13:23:21 crc kubenswrapper[4867]: E1006 13:23:21.155572 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90\": container with ID starting with 5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90 not found: ID does not exist" containerID="5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.155609 
4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90"} err="failed to get container status \"5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90\": rpc error: code = NotFound desc = could not find container \"5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90\": container with ID starting with 5a2464d342dc52933b419004a8adfd30b17e43f6e98925653f12539ac7441d90 not found: ID does not exist" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.166824 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-config-data" (OuterVolumeSpecName: "config-data") pod "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" (UID: "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.182334 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" (UID: "8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.191616 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.191669 4867 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.191683 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.191696 4867 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.191706 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.191716 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsqlv\" (UniqueName: \"kubernetes.io/projected/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-kube-api-access-wsqlv\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.191753 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.191762 4867 reconciler_common.go:293] 
"Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.234503 4867 scope.go:117] "RemoveContainer" containerID="890eb876f39d7213ab994e4f7878a7aa46933c45bc36e78c02b7645a4eba0da0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.267881 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.278045 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f72b9a47-8845-4870-9e50-d15fa17e2db4" path="/var/lib/kubelet/pods/f72b9a47-8845-4870-9e50-d15fa17e2db4/volumes" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.294899 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.354605 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 13:23:21 crc kubenswrapper[4867]: W1006 13:23:21.360627 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbbe30dd_179b_4e2b_b011_b395c30e32a9.slice/crio-d26729930a36287c5f82b80dda0d7ade8afe031f0c36de6a05874e992a3c6b13 WatchSource:0}: Error finding container d26729930a36287c5f82b80dda0d7ade8afe031f0c36de6a05874e992a3c6b13: Status 404 returned error can't find the container with id d26729930a36287c5f82b80dda0d7ade8afe031f0c36de6a05874e992a3c6b13 Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.368149 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.415679 4867 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.439922 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.477943 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:23:21 crc kubenswrapper[4867]: E1006 13:23:21.479778 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f1a49ca-4178-4e3b-958e-a766b5087e59" containerName="mariadb-database-create" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.479878 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f1a49ca-4178-4e3b-958e-a766b5087e59" containerName="mariadb-database-create" Oct 06 13:23:21 crc kubenswrapper[4867]: E1006 13:23:21.479958 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" containerName="glance-httpd" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.480170 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" containerName="glance-httpd" Oct 06 13:23:21 crc kubenswrapper[4867]: E1006 13:23:21.480353 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8" containerName="mariadb-database-create" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.480697 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8" containerName="mariadb-database-create" Oct 06 13:23:21 crc kubenswrapper[4867]: E1006 13:23:21.480996 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" containerName="glance-log" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.481083 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" 
containerName="glance-log" Oct 06 13:23:21 crc kubenswrapper[4867]: E1006 13:23:21.481164 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb523789-87d3-4cd0-8047-a431894a91ff" containerName="mariadb-database-create" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.481235 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb523789-87d3-4cd0-8047-a431894a91ff" containerName="mariadb-database-create" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.481847 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" containerName="glance-log" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.482020 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" containerName="glance-httpd" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.482123 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8" containerName="mariadb-database-create" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.482239 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb523789-87d3-4cd0-8047-a431894a91ff" containerName="mariadb-database-create" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.482350 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f1a49ca-4178-4e3b-958e-a766b5087e59" containerName="mariadb-database-create" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.495663 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.498881 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.499220 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.501319 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.603316 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.603377 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.603488 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe3366d3-09d4-49fb-a388-3291fe1e65b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.603545 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.603566 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe3366d3-09d4-49fb-a388-3291fe1e65b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.603605 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.603636 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.603659 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hznfb\" (UniqueName: \"kubernetes.io/projected/fe3366d3-09d4-49fb-a388-3291fe1e65b0-kube-api-access-hznfb\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.705975 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.706032 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe3366d3-09d4-49fb-a388-3291fe1e65b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.706070 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.706100 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.706125 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hznfb\" (UniqueName: \"kubernetes.io/projected/fe3366d3-09d4-49fb-a388-3291fe1e65b0-kube-api-access-hznfb\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.706152 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.706168 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.706325 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe3366d3-09d4-49fb-a388-3291fe1e65b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.706785 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe3366d3-09d4-49fb-a388-3291fe1e65b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.707922 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.710999 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fe3366d3-09d4-49fb-a388-3291fe1e65b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " 
pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.714709 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.721017 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.730761 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.732570 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hznfb\" (UniqueName: \"kubernetes.io/projected/fe3366d3-09d4-49fb-a388-3291fe1e65b0-kube-api-access-hznfb\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.749172 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe3366d3-09d4-49fb-a388-3291fe1e65b0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: 
I1006 13:23:21.759949 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"fe3366d3-09d4-49fb-a388-3291fe1e65b0\") " pod="openstack/glance-default-internal-api-0" Oct 06 13:23:21 crc kubenswrapper[4867]: I1006 13:23:21.872425 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:22 crc kubenswrapper[4867]: I1006 13:23:22.102915 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c54babca-19bd-4a0b-a320-359b744ed066","Type":"ContainerStarted","Data":"0fcefdec9c5b3bbc2b8724ea3ba1f521337b75b6654265282a891daf7e3b71ac"} Oct 06 13:23:22 crc kubenswrapper[4867]: I1006 13:23:22.106724 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbbe30dd-179b-4e2b-b011-b395c30e32a9","Type":"ContainerStarted","Data":"d26729930a36287c5f82b80dda0d7ade8afe031f0c36de6a05874e992a3c6b13"} Oct 06 13:23:22 crc kubenswrapper[4867]: I1006 13:23:22.111569 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="ceilometer-central-agent" containerID="cri-o://73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36" gracePeriod=30 Oct 06 13:23:22 crc kubenswrapper[4867]: I1006 13:23:22.111810 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 13:23:22 crc kubenswrapper[4867]: I1006 13:23:22.111827 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="sg-core" containerID="cri-o://c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e" gracePeriod=30 Oct 06 
13:23:22 crc kubenswrapper[4867]: I1006 13:23:22.111859 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="ceilometer-notification-agent" containerID="cri-o://534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087" gracePeriod=30 Oct 06 13:23:22 crc kubenswrapper[4867]: I1006 13:23:22.111882 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="proxy-httpd" containerID="cri-o://d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc" gracePeriod=30 Oct 06 13:23:22 crc kubenswrapper[4867]: I1006 13:23:22.184443 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.553206645 podStartE2EDuration="7.184412795s" podCreationTimestamp="2025-10-06 13:23:15 +0000 UTC" firstStartedPulling="2025-10-06 13:23:17.172876056 +0000 UTC m=+1176.630824200" lastFinishedPulling="2025-10-06 13:23:21.804082206 +0000 UTC m=+1181.262030350" observedRunningTime="2025-10-06 13:23:22.169996841 +0000 UTC m=+1181.627944995" watchObservedRunningTime="2025-10-06 13:23:22.184412795 +0000 UTC m=+1181.642360939" Oct 06 13:23:22 crc kubenswrapper[4867]: I1006 13:23:22.516049 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 13:23:22 crc kubenswrapper[4867]: W1006 13:23:22.564911 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe3366d3_09d4_49fb_a388_3291fe1e65b0.slice/crio-6ce30330f5f941dca71c203a977a2a3d32f52a1450d73b4a7fb4ad5f3f24206b WatchSource:0}: Error finding container 6ce30330f5f941dca71c203a977a2a3d32f52a1450d73b4a7fb4ad5f3f24206b: Status 404 returned error can't find the container with id 
6ce30330f5f941dca71c203a977a2a3d32f52a1450d73b4a7fb4ad5f3f24206b Oct 06 13:23:23 crc kubenswrapper[4867]: I1006 13:23:23.179968 4867 generic.go:334] "Generic (PLEG): container finished" podID="4d81f192-aeec-4482-abbd-35e95040a432" containerID="c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e" exitCode=2 Oct 06 13:23:23 crc kubenswrapper[4867]: I1006 13:23:23.180520 4867 generic.go:334] "Generic (PLEG): container finished" podID="4d81f192-aeec-4482-abbd-35e95040a432" containerID="534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087" exitCode=0 Oct 06 13:23:23 crc kubenswrapper[4867]: I1006 13:23:23.180161 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d81f192-aeec-4482-abbd-35e95040a432","Type":"ContainerStarted","Data":"d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc"} Oct 06 13:23:23 crc kubenswrapper[4867]: I1006 13:23:23.180614 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d81f192-aeec-4482-abbd-35e95040a432","Type":"ContainerDied","Data":"c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e"} Oct 06 13:23:23 crc kubenswrapper[4867]: I1006 13:23:23.180631 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d81f192-aeec-4482-abbd-35e95040a432","Type":"ContainerDied","Data":"534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087"} Oct 06 13:23:23 crc kubenswrapper[4867]: I1006 13:23:23.183641 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbbe30dd-179b-4e2b-b011-b395c30e32a9","Type":"ContainerStarted","Data":"3d6429b2eb9f554d50c403d92aeb20a1bdf72b092e9c2d94851e900d1668aeff"} Oct 06 13:23:23 crc kubenswrapper[4867]: I1006 13:23:23.186101 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"fe3366d3-09d4-49fb-a388-3291fe1e65b0","Type":"ContainerStarted","Data":"6ce30330f5f941dca71c203a977a2a3d32f52a1450d73b4a7fb4ad5f3f24206b"} Oct 06 13:23:23 crc kubenswrapper[4867]: I1006 13:23:23.233102 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae" path="/var/lib/kubelet/pods/8f7fa4a1-ed62-4a22-ad5b-1030ca8ebaae/volumes" Oct 06 13:23:24 crc kubenswrapper[4867]: I1006 13:23:24.199202 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbbe30dd-179b-4e2b-b011-b395c30e32a9","Type":"ContainerStarted","Data":"3415d41e08986842264cc064ba0c6a4742025d60253be30d624833e1b002ffe3"} Oct 06 13:23:24 crc kubenswrapper[4867]: I1006 13:23:24.203514 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe3366d3-09d4-49fb-a388-3291fe1e65b0","Type":"ContainerStarted","Data":"60d0c121d314179e16c2ff8638045d5f0c3908450012ff04a009728429f2bcc7"} Oct 06 13:23:24 crc kubenswrapper[4867]: I1006 13:23:24.203573 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fe3366d3-09d4-49fb-a388-3291fe1e65b0","Type":"ContainerStarted","Data":"7a50b11ad923421b67b7973f6139f1a178b6979f51e97eeb8a34c3bbe2d92cc0"} Oct 06 13:23:24 crc kubenswrapper[4867]: I1006 13:23:24.229803 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.229781102 podStartE2EDuration="4.229781102s" podCreationTimestamp="2025-10-06 13:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:23:24.224045365 +0000 UTC m=+1183.681993519" watchObservedRunningTime="2025-10-06 13:23:24.229781102 +0000 UTC m=+1183.687729246" Oct 06 13:23:24 crc kubenswrapper[4867]: I1006 13:23:24.269862 4867 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.269825977 podStartE2EDuration="3.269825977s" podCreationTimestamp="2025-10-06 13:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:23:24.248682838 +0000 UTC m=+1183.706630982" watchObservedRunningTime="2025-10-06 13:23:24.269825977 +0000 UTC m=+1183.727774121" Oct 06 13:23:28 crc kubenswrapper[4867]: I1006 13:23:28.974487 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:29 crc kubenswrapper[4867]: I1006 13:23:29.006569 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:29 crc kubenswrapper[4867]: I1006 13:23:29.260529 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:29 crc kubenswrapper[4867]: I1006 13:23:29.286115 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:29 crc kubenswrapper[4867]: I1006 13:23:29.327817 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:23:30 crc kubenswrapper[4867]: I1006 13:23:30.265979 4867 generic.go:334] "Generic (PLEG): container finished" podID="4d81f192-aeec-4482-abbd-35e95040a432" containerID="73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36" exitCode=0 Oct 06 13:23:30 crc kubenswrapper[4867]: I1006 13:23:30.266065 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d81f192-aeec-4482-abbd-35e95040a432","Type":"ContainerDied","Data":"73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36"} Oct 06 13:23:30 crc kubenswrapper[4867]: I1006 13:23:30.676746 4867 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 13:23:30 crc kubenswrapper[4867]: I1006 13:23:30.676813 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 13:23:30 crc kubenswrapper[4867]: I1006 13:23:30.704032 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 13:23:30 crc kubenswrapper[4867]: I1006 13:23:30.716456 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 13:23:31 crc kubenswrapper[4867]: I1006 13:23:31.275240 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" containerID="cri-o://0fcefdec9c5b3bbc2b8724ea3ba1f521337b75b6654265282a891daf7e3b71ac" gracePeriod=30 Oct 06 13:23:31 crc kubenswrapper[4867]: I1006 13:23:31.275668 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 13:23:31 crc kubenswrapper[4867]: I1006 13:23:31.276034 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 13:23:31 crc kubenswrapper[4867]: I1006 13:23:31.872626 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:31 crc kubenswrapper[4867]: I1006 13:23:31.872684 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:31 crc kubenswrapper[4867]: I1006 13:23:31.919898 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:31 crc kubenswrapper[4867]: I1006 
13:23:31.938974 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:32 crc kubenswrapper[4867]: I1006 13:23:32.291888 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:32 crc kubenswrapper[4867]: I1006 13:23:32.291978 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:33 crc kubenswrapper[4867]: I1006 13:23:33.086716 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 13:23:33 crc kubenswrapper[4867]: I1006 13:23:33.110751 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.196749 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.252173 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.315652 4867 generic.go:334] "Generic (PLEG): container finished" podID="c54babca-19bd-4a0b-a320-359b744ed066" containerID="0fcefdec9c5b3bbc2b8724ea3ba1f521337b75b6654265282a891daf7e3b71ac" exitCode=0 Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.315984 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c54babca-19bd-4a0b-a320-359b744ed066","Type":"ContainerDied","Data":"0fcefdec9c5b3bbc2b8724ea3ba1f521337b75b6654265282a891daf7e3b71ac"} Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.316048 4867 scope.go:117] "RemoveContainer" containerID="890eb876f39d7213ab994e4f7878a7aa46933c45bc36e78c02b7645a4eba0da0" Oct 06 13:23:34 crc 
kubenswrapper[4867]: I1006 13:23:34.472147 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.523778 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-combined-ca-bundle\") pod \"c54babca-19bd-4a0b-a320-359b744ed066\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.525210 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gspp7\" (UniqueName: \"kubernetes.io/projected/c54babca-19bd-4a0b-a320-359b744ed066-kube-api-access-gspp7\") pod \"c54babca-19bd-4a0b-a320-359b744ed066\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.525281 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54babca-19bd-4a0b-a320-359b744ed066-logs\") pod \"c54babca-19bd-4a0b-a320-359b744ed066\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.525426 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-config-data\") pod \"c54babca-19bd-4a0b-a320-359b744ed066\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.525482 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-custom-prometheus-ca\") pod \"c54babca-19bd-4a0b-a320-359b744ed066\" (UID: \"c54babca-19bd-4a0b-a320-359b744ed066\") " Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.527317 
4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c54babca-19bd-4a0b-a320-359b744ed066-logs" (OuterVolumeSpecName: "logs") pod "c54babca-19bd-4a0b-a320-359b744ed066" (UID: "c54babca-19bd-4a0b-a320-359b744ed066"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.539679 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54babca-19bd-4a0b-a320-359b744ed066-kube-api-access-gspp7" (OuterVolumeSpecName: "kube-api-access-gspp7") pod "c54babca-19bd-4a0b-a320-359b744ed066" (UID: "c54babca-19bd-4a0b-a320-359b744ed066"). InnerVolumeSpecName "kube-api-access-gspp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.557711 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c54babca-19bd-4a0b-a320-359b744ed066" (UID: "c54babca-19bd-4a0b-a320-359b744ed066"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.570644 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "c54babca-19bd-4a0b-a320-359b744ed066" (UID: "c54babca-19bd-4a0b-a320-359b744ed066"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.608227 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-config-data" (OuterVolumeSpecName: "config-data") pod "c54babca-19bd-4a0b-a320-359b744ed066" (UID: "c54babca-19bd-4a0b-a320-359b744ed066"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.628532 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.628598 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gspp7\" (UniqueName: \"kubernetes.io/projected/c54babca-19bd-4a0b-a320-359b744ed066-kube-api-access-gspp7\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.628617 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c54babca-19bd-4a0b-a320-359b744ed066-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.628629 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:34 crc kubenswrapper[4867]: I1006 13:23:34.628638 4867 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c54babca-19bd-4a0b-a320-359b744ed066-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.328968 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.329475 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"c54babca-19bd-4a0b-a320-359b744ed066","Type":"ContainerDied","Data":"e9e0d6952a07d7ae3820dd4215e92d3944b9e4d2ff2f8669c94d7baeaa6597b7"} Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.329510 4867 scope.go:117] "RemoveContainer" containerID="0fcefdec9c5b3bbc2b8724ea3ba1f521337b75b6654265282a891daf7e3b71ac" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.367331 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.372722 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.393350 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:23:35 crc kubenswrapper[4867]: E1006 13:23:35.393867 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.393886 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:35 crc kubenswrapper[4867]: E1006 13:23:35.393929 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.393935 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.394150 4867 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.394166 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.395186 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.397820 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.452685 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.453679 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886a11ab-54f5-45c1-a604-41203d080360-config-data\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.453719 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ptmt\" (UniqueName: \"kubernetes.io/projected/886a11ab-54f5-45c1-a604-41203d080360-kube-api-access-6ptmt\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.454516 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886a11ab-54f5-45c1-a604-41203d080360-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " 
pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.454684 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/886a11ab-54f5-45c1-a604-41203d080360-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.454732 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/886a11ab-54f5-45c1-a604-41203d080360-logs\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.556972 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886a11ab-54f5-45c1-a604-41203d080360-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.557182 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/886a11ab-54f5-45c1-a604-41203d080360-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.557227 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/886a11ab-54f5-45c1-a604-41203d080360-logs\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc 
kubenswrapper[4867]: I1006 13:23:35.557301 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886a11ab-54f5-45c1-a604-41203d080360-config-data\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.557327 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ptmt\" (UniqueName: \"kubernetes.io/projected/886a11ab-54f5-45c1-a604-41203d080360-kube-api-access-6ptmt\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.558799 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/886a11ab-54f5-45c1-a604-41203d080360-logs\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.563273 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886a11ab-54f5-45c1-a604-41203d080360-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.564797 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886a11ab-54f5-45c1-a604-41203d080360-config-data\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.566103 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/886a11ab-54f5-45c1-a604-41203d080360-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.576340 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ptmt\" (UniqueName: \"kubernetes.io/projected/886a11ab-54f5-45c1-a604-41203d080360-kube-api-access-6ptmt\") pod \"watcher-decision-engine-0\" (UID: \"886a11ab-54f5-45c1-a604-41203d080360\") " pod="openstack/watcher-decision-engine-0" Oct 06 13:23:35 crc kubenswrapper[4867]: I1006 13:23:35.716811 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.039336 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6c46-account-create-f62ww"] Oct 06 13:23:36 crc kubenswrapper[4867]: E1006 13:23:36.040408 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.040426 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.040703 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.041748 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6c46-account-create-f62ww" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.049144 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.054461 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6c46-account-create-f62ww"] Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.100331 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vccl\" (UniqueName: \"kubernetes.io/projected/1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114-kube-api-access-5vccl\") pod \"nova-api-6c46-account-create-f62ww\" (UID: \"1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114\") " pod="openstack/nova-api-6c46-account-create-f62ww" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.117992 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.212454 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vccl\" (UniqueName: \"kubernetes.io/projected/1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114-kube-api-access-5vccl\") pod \"nova-api-6c46-account-create-f62ww\" (UID: \"1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114\") " pod="openstack/nova-api-6c46-account-create-f62ww" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.285923 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vccl\" (UniqueName: \"kubernetes.io/projected/1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114-kube-api-access-5vccl\") pod \"nova-api-6c46-account-create-f62ww\" (UID: \"1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114\") " pod="openstack/nova-api-6c46-account-create-f62ww" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.323489 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-33f7-account-create-t9qfg"] Oct 
06 13:23:36 crc kubenswrapper[4867]: E1006 13:23:36.323915 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.323929 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.324192 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54babca-19bd-4a0b-a320-359b744ed066" containerName="watcher-decision-engine" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.324916 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-33f7-account-create-t9qfg" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.332826 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.357261 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-33f7-account-create-t9qfg"] Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.374084 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6c46-account-create-f62ww" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.406522 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"886a11ab-54f5-45c1-a604-41203d080360","Type":"ContainerStarted","Data":"77835ef50d03aacb93b118c7ae1da0ecb209c9e35e60defff6748f6f4788b21c"} Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.414534 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ea86-account-create-jdmnp"] Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.416865 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ea86-account-create-jdmnp" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.421461 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49k74\" (UniqueName: \"kubernetes.io/projected/40974151-3ffd-421d-8a05-acb6b0ea7f0e-kube-api-access-49k74\") pod \"nova-cell0-33f7-account-create-t9qfg\" (UID: \"40974151-3ffd-421d-8a05-acb6b0ea7f0e\") " pod="openstack/nova-cell0-33f7-account-create-t9qfg" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.421970 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.455703 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ea86-account-create-jdmnp"] Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.523038 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49k74\" (UniqueName: \"kubernetes.io/projected/40974151-3ffd-421d-8a05-acb6b0ea7f0e-kube-api-access-49k74\") pod \"nova-cell0-33f7-account-create-t9qfg\" (UID: \"40974151-3ffd-421d-8a05-acb6b0ea7f0e\") " pod="openstack/nova-cell0-33f7-account-create-t9qfg" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.523091 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2prs7\" (UniqueName: \"kubernetes.io/projected/c9250a9b-2ae9-4759-ad53-a8ff5016200f-kube-api-access-2prs7\") pod \"nova-cell1-ea86-account-create-jdmnp\" (UID: \"c9250a9b-2ae9-4759-ad53-a8ff5016200f\") " pod="openstack/nova-cell1-ea86-account-create-jdmnp" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.546602 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49k74\" (UniqueName: \"kubernetes.io/projected/40974151-3ffd-421d-8a05-acb6b0ea7f0e-kube-api-access-49k74\") pod 
\"nova-cell0-33f7-account-create-t9qfg\" (UID: \"40974151-3ffd-421d-8a05-acb6b0ea7f0e\") " pod="openstack/nova-cell0-33f7-account-create-t9qfg" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.625416 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2prs7\" (UniqueName: \"kubernetes.io/projected/c9250a9b-2ae9-4759-ad53-a8ff5016200f-kube-api-access-2prs7\") pod \"nova-cell1-ea86-account-create-jdmnp\" (UID: \"c9250a9b-2ae9-4759-ad53-a8ff5016200f\") " pod="openstack/nova-cell1-ea86-account-create-jdmnp" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.649095 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2prs7\" (UniqueName: \"kubernetes.io/projected/c9250a9b-2ae9-4759-ad53-a8ff5016200f-kube-api-access-2prs7\") pod \"nova-cell1-ea86-account-create-jdmnp\" (UID: \"c9250a9b-2ae9-4759-ad53-a8ff5016200f\") " pod="openstack/nova-cell1-ea86-account-create-jdmnp" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.743308 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-33f7-account-create-t9qfg" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.776788 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ea86-account-create-jdmnp" Oct 06 13:23:36 crc kubenswrapper[4867]: I1006 13:23:36.962174 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6c46-account-create-f62ww"] Oct 06 13:23:37 crc kubenswrapper[4867]: W1006 13:23:37.232025 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40974151_3ffd_421d_8a05_acb6b0ea7f0e.slice/crio-09529fff36aebfa3ff05a57574f04cd40faa9069943a545aeaac8b3f06f8acbc WatchSource:0}: Error finding container 09529fff36aebfa3ff05a57574f04cd40faa9069943a545aeaac8b3f06f8acbc: Status 404 returned error can't find the container with id 09529fff36aebfa3ff05a57574f04cd40faa9069943a545aeaac8b3f06f8acbc Oct 06 13:23:37 crc kubenswrapper[4867]: I1006 13:23:37.235779 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c54babca-19bd-4a0b-a320-359b744ed066" path="/var/lib/kubelet/pods/c54babca-19bd-4a0b-a320-359b744ed066/volumes" Oct 06 13:23:37 crc kubenswrapper[4867]: I1006 13:23:37.236827 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-33f7-account-create-t9qfg"] Oct 06 13:23:37 crc kubenswrapper[4867]: I1006 13:23:37.363850 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ea86-account-create-jdmnp"] Oct 06 13:23:37 crc kubenswrapper[4867]: I1006 13:23:37.421286 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"886a11ab-54f5-45c1-a604-41203d080360","Type":"ContainerStarted","Data":"4d92d9424a5e87e7747e0952af5cbe5f56669052462e83e261db4ba8ff46d3ad"} Oct 06 13:23:37 crc kubenswrapper[4867]: I1006 13:23:37.436683 4867 generic.go:334] "Generic (PLEG): container finished" podID="1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114" containerID="7558fb15c1acf747f5b9ae73be1d9d7d0898ae57dcbdf4cb85aa0ce1a31db1b6" exitCode=0 Oct 06 13:23:37 crc kubenswrapper[4867]: I1006 
13:23:37.436845 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c46-account-create-f62ww" event={"ID":"1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114","Type":"ContainerDied","Data":"7558fb15c1acf747f5b9ae73be1d9d7d0898ae57dcbdf4cb85aa0ce1a31db1b6"} Oct 06 13:23:37 crc kubenswrapper[4867]: I1006 13:23:37.436891 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c46-account-create-f62ww" event={"ID":"1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114","Type":"ContainerStarted","Data":"a1cf092a7497b69b05788a85de86c72f17d098a637e5a031f140d01634cea370"} Oct 06 13:23:37 crc kubenswrapper[4867]: I1006 13:23:37.439129 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-33f7-account-create-t9qfg" event={"ID":"40974151-3ffd-421d-8a05-acb6b0ea7f0e","Type":"ContainerStarted","Data":"09529fff36aebfa3ff05a57574f04cd40faa9069943a545aeaac8b3f06f8acbc"} Oct 06 13:23:37 crc kubenswrapper[4867]: I1006 13:23:37.440138 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea86-account-create-jdmnp" event={"ID":"c9250a9b-2ae9-4759-ad53-a8ff5016200f","Type":"ContainerStarted","Data":"f349cf83fb123bffdd5aa3003e48a3c348d65437701fec4ff018c08e3448d708"} Oct 06 13:23:37 crc kubenswrapper[4867]: I1006 13:23:37.463325 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.463303093 podStartE2EDuration="2.463303093s" podCreationTimestamp="2025-10-06 13:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:23:37.448150109 +0000 UTC m=+1196.906098253" watchObservedRunningTime="2025-10-06 13:23:37.463303093 +0000 UTC m=+1196.921251257" Oct 06 13:23:38 crc kubenswrapper[4867]: I1006 13:23:38.450741 4867 generic.go:334] "Generic (PLEG): container finished" podID="40974151-3ffd-421d-8a05-acb6b0ea7f0e" 
containerID="84d23888b58dea6466f6c6a45b07358d5daf7bd5d55f1211cd35aec12a43554c" exitCode=0 Oct 06 13:23:38 crc kubenswrapper[4867]: I1006 13:23:38.450834 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-33f7-account-create-t9qfg" event={"ID":"40974151-3ffd-421d-8a05-acb6b0ea7f0e","Type":"ContainerDied","Data":"84d23888b58dea6466f6c6a45b07358d5daf7bd5d55f1211cd35aec12a43554c"} Oct 06 13:23:38 crc kubenswrapper[4867]: I1006 13:23:38.452551 4867 generic.go:334] "Generic (PLEG): container finished" podID="c9250a9b-2ae9-4759-ad53-a8ff5016200f" containerID="c3d92978090554240239a433470df8ae70cdbb5a44d699d61300cf84f5db208b" exitCode=0 Oct 06 13:23:38 crc kubenswrapper[4867]: I1006 13:23:38.452659 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea86-account-create-jdmnp" event={"ID":"c9250a9b-2ae9-4759-ad53-a8ff5016200f","Type":"ContainerDied","Data":"c3d92978090554240239a433470df8ae70cdbb5a44d699d61300cf84f5db208b"} Oct 06 13:23:38 crc kubenswrapper[4867]: I1006 13:23:38.826692 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6c46-account-create-f62ww" Oct 06 13:23:38 crc kubenswrapper[4867]: I1006 13:23:38.904604 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vccl\" (UniqueName: \"kubernetes.io/projected/1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114-kube-api-access-5vccl\") pod \"1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114\" (UID: \"1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114\") " Oct 06 13:23:38 crc kubenswrapper[4867]: I1006 13:23:38.913557 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114-kube-api-access-5vccl" (OuterVolumeSpecName: "kube-api-access-5vccl") pod "1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114" (UID: "1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114"). InnerVolumeSpecName "kube-api-access-5vccl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:23:39 crc kubenswrapper[4867]: I1006 13:23:39.007135 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vccl\" (UniqueName: \"kubernetes.io/projected/1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114-kube-api-access-5vccl\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:39 crc kubenswrapper[4867]: I1006 13:23:39.468140 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6c46-account-create-f62ww" Oct 06 13:23:39 crc kubenswrapper[4867]: I1006 13:23:39.468669 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c46-account-create-f62ww" event={"ID":"1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114","Type":"ContainerDied","Data":"a1cf092a7497b69b05788a85de86c72f17d098a637e5a031f140d01634cea370"} Oct 06 13:23:39 crc kubenswrapper[4867]: I1006 13:23:39.468737 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1cf092a7497b69b05788a85de86c72f17d098a637e5a031f140d01634cea370" Oct 06 13:23:39 crc kubenswrapper[4867]: I1006 13:23:39.874510 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-33f7-account-create-t9qfg" Oct 06 13:23:39 crc kubenswrapper[4867]: I1006 13:23:39.932505 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49k74\" (UniqueName: \"kubernetes.io/projected/40974151-3ffd-421d-8a05-acb6b0ea7f0e-kube-api-access-49k74\") pod \"40974151-3ffd-421d-8a05-acb6b0ea7f0e\" (UID: \"40974151-3ffd-421d-8a05-acb6b0ea7f0e\") " Oct 06 13:23:39 crc kubenswrapper[4867]: I1006 13:23:39.939532 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40974151-3ffd-421d-8a05-acb6b0ea7f0e-kube-api-access-49k74" (OuterVolumeSpecName: "kube-api-access-49k74") pod "40974151-3ffd-421d-8a05-acb6b0ea7f0e" (UID: "40974151-3ffd-421d-8a05-acb6b0ea7f0e"). 
InnerVolumeSpecName "kube-api-access-49k74". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:23:40 crc kubenswrapper[4867]: I1006 13:23:40.017457 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ea86-account-create-jdmnp" Oct 06 13:23:40 crc kubenswrapper[4867]: I1006 13:23:40.034687 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49k74\" (UniqueName: \"kubernetes.io/projected/40974151-3ffd-421d-8a05-acb6b0ea7f0e-kube-api-access-49k74\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:40 crc kubenswrapper[4867]: I1006 13:23:40.136088 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2prs7\" (UniqueName: \"kubernetes.io/projected/c9250a9b-2ae9-4759-ad53-a8ff5016200f-kube-api-access-2prs7\") pod \"c9250a9b-2ae9-4759-ad53-a8ff5016200f\" (UID: \"c9250a9b-2ae9-4759-ad53-a8ff5016200f\") " Oct 06 13:23:40 crc kubenswrapper[4867]: I1006 13:23:40.138952 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9250a9b-2ae9-4759-ad53-a8ff5016200f-kube-api-access-2prs7" (OuterVolumeSpecName: "kube-api-access-2prs7") pod "c9250a9b-2ae9-4759-ad53-a8ff5016200f" (UID: "c9250a9b-2ae9-4759-ad53-a8ff5016200f"). InnerVolumeSpecName "kube-api-access-2prs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:23:40 crc kubenswrapper[4867]: I1006 13:23:40.238322 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2prs7\" (UniqueName: \"kubernetes.io/projected/c9250a9b-2ae9-4759-ad53-a8ff5016200f-kube-api-access-2prs7\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:40 crc kubenswrapper[4867]: I1006 13:23:40.483537 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-33f7-account-create-t9qfg" event={"ID":"40974151-3ffd-421d-8a05-acb6b0ea7f0e","Type":"ContainerDied","Data":"09529fff36aebfa3ff05a57574f04cd40faa9069943a545aeaac8b3f06f8acbc"} Oct 06 13:23:40 crc kubenswrapper[4867]: I1006 13:23:40.483626 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-33f7-account-create-t9qfg" Oct 06 13:23:40 crc kubenswrapper[4867]: I1006 13:23:40.483631 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09529fff36aebfa3ff05a57574f04cd40faa9069943a545aeaac8b3f06f8acbc" Oct 06 13:23:40 crc kubenswrapper[4867]: I1006 13:23:40.487078 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea86-account-create-jdmnp" event={"ID":"c9250a9b-2ae9-4759-ad53-a8ff5016200f","Type":"ContainerDied","Data":"f349cf83fb123bffdd5aa3003e48a3c348d65437701fec4ff018c08e3448d708"} Oct 06 13:23:40 crc kubenswrapper[4867]: I1006 13:23:40.487328 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f349cf83fb123bffdd5aa3003e48a3c348d65437701fec4ff018c08e3448d708" Oct 06 13:23:40 crc kubenswrapper[4867]: I1006 13:23:40.487126 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ea86-account-create-jdmnp" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.572488 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6jf29"] Oct 06 13:23:41 crc kubenswrapper[4867]: E1006 13:23:41.574475 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9250a9b-2ae9-4759-ad53-a8ff5016200f" containerName="mariadb-account-create" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.574520 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9250a9b-2ae9-4759-ad53-a8ff5016200f" containerName="mariadb-account-create" Oct 06 13:23:41 crc kubenswrapper[4867]: E1006 13:23:41.574537 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114" containerName="mariadb-account-create" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.574546 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114" containerName="mariadb-account-create" Oct 06 13:23:41 crc kubenswrapper[4867]: E1006 13:23:41.574573 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40974151-3ffd-421d-8a05-acb6b0ea7f0e" containerName="mariadb-account-create" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.574581 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="40974151-3ffd-421d-8a05-acb6b0ea7f0e" containerName="mariadb-account-create" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.574815 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="40974151-3ffd-421d-8a05-acb6b0ea7f0e" containerName="mariadb-account-create" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.574847 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114" containerName="mariadb-account-create" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.574864 4867 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c9250a9b-2ae9-4759-ad53-a8ff5016200f" containerName="mariadb-account-create" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.575591 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.578194 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.578617 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.581985 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vqvtf" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.584540 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6jf29"] Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.668825 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7rl7\" (UniqueName: \"kubernetes.io/projected/76c135ec-9d78-4745-b065-e035c34fc51c-kube-api-access-k7rl7\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.668882 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-config-data\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.668997 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-scripts\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.669030 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.772246 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-scripts\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.772421 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.772769 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7rl7\" (UniqueName: \"kubernetes.io/projected/76c135ec-9d78-4745-b065-e035c34fc51c-kube-api-access-k7rl7\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.772871 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-config-data\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.780114 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-config-data\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.781225 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-scripts\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.783574 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.791512 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7rl7\" (UniqueName: \"kubernetes.io/projected/76c135ec-9d78-4745-b065-e035c34fc51c-kube-api-access-k7rl7\") pod \"nova-cell0-conductor-db-sync-6jf29\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:41 crc kubenswrapper[4867]: I1006 13:23:41.906729 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:23:42 crc kubenswrapper[4867]: I1006 13:23:42.403871 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:23:42 crc kubenswrapper[4867]: I1006 13:23:42.408017 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6jf29"] Oct 06 13:23:42 crc kubenswrapper[4867]: I1006 13:23:42.515994 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6jf29" event={"ID":"76c135ec-9d78-4745-b065-e035c34fc51c","Type":"ContainerStarted","Data":"6788598a1765daf7d00fec6cd368e576ed0e5eec263667e39761f4a29c9a50c6"} Oct 06 13:23:42 crc kubenswrapper[4867]: I1006 13:23:42.874657 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:23:42 crc kubenswrapper[4867]: I1006 13:23:42.875086 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:23:45 crc kubenswrapper[4867]: I1006 13:23:45.717317 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:45 crc kubenswrapper[4867]: I1006 13:23:45.754554 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:46 crc kubenswrapper[4867]: I1006 13:23:46.471896 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" 
podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 06 13:23:46 crc kubenswrapper[4867]: I1006 13:23:46.565716 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:46 crc kubenswrapper[4867]: I1006 13:23:46.600181 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 06 13:23:50 crc kubenswrapper[4867]: I1006 13:23:50.156701 4867 kubelet.go:1505] "Image garbage collection succeeded" Oct 06 13:23:50 crc kubenswrapper[4867]: I1006 13:23:50.624000 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6jf29" event={"ID":"76c135ec-9d78-4745-b065-e035c34fc51c","Type":"ContainerStarted","Data":"f6b312f605aa716b29d47faf3a897e9c60c9bcb109435e203fe7a221ed7b755d"} Oct 06 13:23:52 crc kubenswrapper[4867]: E1006 13:23:52.511605 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d81f192_aeec_4482_abbd_35e95040a432.slice/crio-conmon-d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc.scope\": RecentStats: unable to find data in memory cache]" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.602401 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.638650 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6jf29" podStartSLOduration=4.106380867 podStartE2EDuration="11.638356962s" podCreationTimestamp="2025-10-06 13:23:41 +0000 UTC" firstStartedPulling="2025-10-06 13:23:42.403632657 +0000 UTC m=+1201.861580801" lastFinishedPulling="2025-10-06 13:23:49.935608712 +0000 UTC m=+1209.393556896" observedRunningTime="2025-10-06 13:23:50.64983654 +0000 UTC m=+1210.107784724" watchObservedRunningTime="2025-10-06 13:23:52.638356962 +0000 UTC m=+1212.096305106" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.658533 4867 generic.go:334] "Generic (PLEG): container finished" podID="4d81f192-aeec-4482-abbd-35e95040a432" containerID="d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc" exitCode=137 Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.658580 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d81f192-aeec-4482-abbd-35e95040a432","Type":"ContainerDied","Data":"d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc"} Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.658610 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d81f192-aeec-4482-abbd-35e95040a432","Type":"ContainerDied","Data":"66241a60969f3769146e136cef7b5b92d3e568ad2ed4f74ccf2ca9a5af328f8d"} Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.658628 4867 scope.go:117] "RemoveContainer" containerID="d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.658780 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.690982 4867 scope.go:117] "RemoveContainer" containerID="c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.712938 4867 scope.go:117] "RemoveContainer" containerID="534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.739614 4867 scope.go:117] "RemoveContainer" containerID="73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.755150 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-config-data\") pod \"4d81f192-aeec-4482-abbd-35e95040a432\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.755237 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-run-httpd\") pod \"4d81f192-aeec-4482-abbd-35e95040a432\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.755513 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj9wm\" (UniqueName: \"kubernetes.io/projected/4d81f192-aeec-4482-abbd-35e95040a432-kube-api-access-lj9wm\") pod \"4d81f192-aeec-4482-abbd-35e95040a432\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.755574 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-log-httpd\") pod \"4d81f192-aeec-4482-abbd-35e95040a432\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " Oct 06 
13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.755659 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-scripts\") pod \"4d81f192-aeec-4482-abbd-35e95040a432\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.755708 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-sg-core-conf-yaml\") pod \"4d81f192-aeec-4482-abbd-35e95040a432\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.755804 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-combined-ca-bundle\") pod \"4d81f192-aeec-4482-abbd-35e95040a432\" (UID: \"4d81f192-aeec-4482-abbd-35e95040a432\") " Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.756307 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4d81f192-aeec-4482-abbd-35e95040a432" (UID: "4d81f192-aeec-4482-abbd-35e95040a432"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.756453 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.756779 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4d81f192-aeec-4482-abbd-35e95040a432" (UID: "4d81f192-aeec-4482-abbd-35e95040a432"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.764771 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-scripts" (OuterVolumeSpecName: "scripts") pod "4d81f192-aeec-4482-abbd-35e95040a432" (UID: "4d81f192-aeec-4482-abbd-35e95040a432"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.767488 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d81f192-aeec-4482-abbd-35e95040a432-kube-api-access-lj9wm" (OuterVolumeSpecName: "kube-api-access-lj9wm") pod "4d81f192-aeec-4482-abbd-35e95040a432" (UID: "4d81f192-aeec-4482-abbd-35e95040a432"). InnerVolumeSpecName "kube-api-access-lj9wm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.769538 4867 scope.go:117] "RemoveContainer" containerID="d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc" Oct 06 13:23:52 crc kubenswrapper[4867]: E1006 13:23:52.769953 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc\": container with ID starting with d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc not found: ID does not exist" containerID="d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.769994 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc"} err="failed to get container status \"d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc\": rpc error: code = NotFound desc = could not find container \"d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc\": container with ID starting with d60d2f62fe51ee425a2dc6d5c1ebadfe98458c87abd6df59860eecbdeaed3bdc not found: ID does not exist" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.770021 4867 scope.go:117] "RemoveContainer" containerID="c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e" Oct 06 13:23:52 crc kubenswrapper[4867]: E1006 13:23:52.770286 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e\": container with ID starting with c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e not found: ID does not exist" containerID="c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.770315 
4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e"} err="failed to get container status \"c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e\": rpc error: code = NotFound desc = could not find container \"c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e\": container with ID starting with c3a5c5a1c4e7ce5f1b6392f844eefca790f98b6f2891d03c051696fa5318861e not found: ID does not exist" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.770331 4867 scope.go:117] "RemoveContainer" containerID="534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087" Oct 06 13:23:52 crc kubenswrapper[4867]: E1006 13:23:52.770524 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087\": container with ID starting with 534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087 not found: ID does not exist" containerID="534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.770551 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087"} err="failed to get container status \"534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087\": rpc error: code = NotFound desc = could not find container \"534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087\": container with ID starting with 534513f162e964710d094492273df846eaebf1265ad97f8532cbe5d57e4af087 not found: ID does not exist" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.770567 4867 scope.go:117] "RemoveContainer" containerID="73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36" Oct 06 13:23:52 crc kubenswrapper[4867]: E1006 
13:23:52.770778 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36\": container with ID starting with 73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36 not found: ID does not exist" containerID="73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.770805 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36"} err="failed to get container status \"73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36\": rpc error: code = NotFound desc = could not find container \"73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36\": container with ID starting with 73d860080112e2c44b7c9d4542e43cc2bc5d7db777b34d8c556361a9681aab36 not found: ID does not exist" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.792916 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4d81f192-aeec-4482-abbd-35e95040a432" (UID: "4d81f192-aeec-4482-abbd-35e95040a432"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.860409 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj9wm\" (UniqueName: \"kubernetes.io/projected/4d81f192-aeec-4482-abbd-35e95040a432-kube-api-access-lj9wm\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.861016 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d81f192-aeec-4482-abbd-35e95040a432-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.861034 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.861050 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.874202 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d81f192-aeec-4482-abbd-35e95040a432" (UID: "4d81f192-aeec-4482-abbd-35e95040a432"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.907620 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-config-data" (OuterVolumeSpecName: "config-data") pod "4d81f192-aeec-4482-abbd-35e95040a432" (UID: "4d81f192-aeec-4482-abbd-35e95040a432"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.964243 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.964330 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d81f192-aeec-4482-abbd-35e95040a432-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:23:52 crc kubenswrapper[4867]: I1006 13:23:52.995881 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.004655 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.050468 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:23:53 crc kubenswrapper[4867]: E1006 13:23:53.065089 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="ceilometer-central-agent" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.065504 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="ceilometer-central-agent" Oct 06 13:23:53 crc kubenswrapper[4867]: E1006 13:23:53.065695 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="sg-core" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.065852 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="sg-core" Oct 06 13:23:53 crc kubenswrapper[4867]: E1006 13:23:53.066061 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d81f192-aeec-4482-abbd-35e95040a432" 
containerName="ceilometer-notification-agent" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.066294 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="ceilometer-notification-agent" Oct 06 13:23:53 crc kubenswrapper[4867]: E1006 13:23:53.066502 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="proxy-httpd" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.066729 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="proxy-httpd" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.067399 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="proxy-httpd" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.067682 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="ceilometer-notification-agent" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.067919 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="ceilometer-central-agent" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.068121 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d81f192-aeec-4482-abbd-35e95040a432" containerName="sg-core" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.073017 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.073321 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.076237 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.076319 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.173867 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-config-data\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.174386 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-log-httpd\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.174415 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.174570 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.174616 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwlsf\" (UniqueName: \"kubernetes.io/projected/64fb7edb-92e9-4b44-b05a-667d470f7b10-kube-api-access-nwlsf\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.174643 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-scripts\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.174667 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-run-httpd\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.232894 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d81f192-aeec-4482-abbd-35e95040a432" path="/var/lib/kubelet/pods/4d81f192-aeec-4482-abbd-35e95040a432/volumes" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.277346 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.277459 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwlsf\" (UniqueName: \"kubernetes.io/projected/64fb7edb-92e9-4b44-b05a-667d470f7b10-kube-api-access-nwlsf\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " 
pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.277516 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-scripts\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.277558 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-run-httpd\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.278216 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-config-data\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.278371 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-log-httpd\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.278399 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.279097 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-run-httpd\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.279151 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-log-httpd\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.284862 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.284864 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-scripts\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.292834 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-config-data\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.299389 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwlsf\" (UniqueName: \"kubernetes.io/projected/64fb7edb-92e9-4b44-b05a-667d470f7b10-kube-api-access-nwlsf\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.299446 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.397020 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:23:53 crc kubenswrapper[4867]: I1006 13:23:53.748780 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:23:53 crc kubenswrapper[4867]: W1006 13:23:53.759840 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64fb7edb_92e9_4b44_b05a_667d470f7b10.slice/crio-c9923a040510e66161d89264474d37ad0b934e7ef1fceceb725a7a624d1faab3 WatchSource:0}: Error finding container c9923a040510e66161d89264474d37ad0b934e7ef1fceceb725a7a624d1faab3: Status 404 returned error can't find the container with id c9923a040510e66161d89264474d37ad0b934e7ef1fceceb725a7a624d1faab3 Oct 06 13:23:54 crc kubenswrapper[4867]: I1006 13:23:54.771273 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64fb7edb-92e9-4b44-b05a-667d470f7b10","Type":"ContainerStarted","Data":"e0a580e6a0daebb9815f50e20686e3bb7251077cab2afa1f34f86b0aa99ae106"} Oct 06 13:23:54 crc kubenswrapper[4867]: I1006 13:23:54.771484 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64fb7edb-92e9-4b44-b05a-667d470f7b10","Type":"ContainerStarted","Data":"c57edc48609e65fde5369175114434967c4882943e1979b777657bda75fe0bdb"} Oct 06 13:23:54 crc kubenswrapper[4867]: I1006 13:23:54.771512 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"64fb7edb-92e9-4b44-b05a-667d470f7b10","Type":"ContainerStarted","Data":"c9923a040510e66161d89264474d37ad0b934e7ef1fceceb725a7a624d1faab3"} Oct 06 13:23:56 crc kubenswrapper[4867]: I1006 13:23:56.794889 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64fb7edb-92e9-4b44-b05a-667d470f7b10","Type":"ContainerStarted","Data":"1ed25887d16f76930f9b625495510144e00929118f3d31c30806f582ffd83626"} Oct 06 13:23:57 crc kubenswrapper[4867]: I1006 13:23:57.820702 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64fb7edb-92e9-4b44-b05a-667d470f7b10","Type":"ContainerStarted","Data":"cdfe0ab8b58aa25595ad138ac85533d85ff017b824419a1c68e14ab1dc9bbf02"} Oct 06 13:23:57 crc kubenswrapper[4867]: I1006 13:23:57.821691 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 13:23:57 crc kubenswrapper[4867]: I1006 13:23:57.853747 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.164823909 podStartE2EDuration="5.853722444s" podCreationTimestamp="2025-10-06 13:23:52 +0000 UTC" firstStartedPulling="2025-10-06 13:23:53.764135133 +0000 UTC m=+1213.222083277" lastFinishedPulling="2025-10-06 13:23:57.453033668 +0000 UTC m=+1216.910981812" observedRunningTime="2025-10-06 13:23:57.842680242 +0000 UTC m=+1217.300628396" watchObservedRunningTime="2025-10-06 13:23:57.853722444 +0000 UTC m=+1217.311670588" Oct 06 13:24:04 crc kubenswrapper[4867]: I1006 13:24:04.892744 4867 generic.go:334] "Generic (PLEG): container finished" podID="76c135ec-9d78-4745-b065-e035c34fc51c" containerID="f6b312f605aa716b29d47faf3a897e9c60c9bcb109435e203fe7a221ed7b755d" exitCode=0 Oct 06 13:24:04 crc kubenswrapper[4867]: I1006 13:24:04.892841 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6jf29" 
event={"ID":"76c135ec-9d78-4745-b065-e035c34fc51c","Type":"ContainerDied","Data":"f6b312f605aa716b29d47faf3a897e9c60c9bcb109435e203fe7a221ed7b755d"} Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.364500 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.429830 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-combined-ca-bundle\") pod \"76c135ec-9d78-4745-b065-e035c34fc51c\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.429879 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7rl7\" (UniqueName: \"kubernetes.io/projected/76c135ec-9d78-4745-b065-e035c34fc51c-kube-api-access-k7rl7\") pod \"76c135ec-9d78-4745-b065-e035c34fc51c\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.430009 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-scripts\") pod \"76c135ec-9d78-4745-b065-e035c34fc51c\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.430097 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-config-data\") pod \"76c135ec-9d78-4745-b065-e035c34fc51c\" (UID: \"76c135ec-9d78-4745-b065-e035c34fc51c\") " Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.436970 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c135ec-9d78-4745-b065-e035c34fc51c-kube-api-access-k7rl7" 
(OuterVolumeSpecName: "kube-api-access-k7rl7") pod "76c135ec-9d78-4745-b065-e035c34fc51c" (UID: "76c135ec-9d78-4745-b065-e035c34fc51c"). InnerVolumeSpecName "kube-api-access-k7rl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.437480 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-scripts" (OuterVolumeSpecName: "scripts") pod "76c135ec-9d78-4745-b065-e035c34fc51c" (UID: "76c135ec-9d78-4745-b065-e035c34fc51c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.460875 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76c135ec-9d78-4745-b065-e035c34fc51c" (UID: "76c135ec-9d78-4745-b065-e035c34fc51c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.465606 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-config-data" (OuterVolumeSpecName: "config-data") pod "76c135ec-9d78-4745-b065-e035c34fc51c" (UID: "76c135ec-9d78-4745-b065-e035c34fc51c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.533914 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.533958 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7rl7\" (UniqueName: \"kubernetes.io/projected/76c135ec-9d78-4745-b065-e035c34fc51c-kube-api-access-k7rl7\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.533970 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.533978 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76c135ec-9d78-4745-b065-e035c34fc51c-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.912822 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6jf29" event={"ID":"76c135ec-9d78-4745-b065-e035c34fc51c","Type":"ContainerDied","Data":"6788598a1765daf7d00fec6cd368e576ed0e5eec263667e39761f4a29c9a50c6"} Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.913132 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6788598a1765daf7d00fec6cd368e576ed0e5eec263667e39761f4a29c9a50c6" Oct 06 13:24:06 crc kubenswrapper[4867]: I1006 13:24:06.912873 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6jf29" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.072741 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 13:24:07 crc kubenswrapper[4867]: E1006 13:24:07.073218 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c135ec-9d78-4745-b065-e035c34fc51c" containerName="nova-cell0-conductor-db-sync" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.073236 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c135ec-9d78-4745-b065-e035c34fc51c" containerName="nova-cell0-conductor-db-sync" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.073462 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="76c135ec-9d78-4745-b065-e035c34fc51c" containerName="nova-cell0-conductor-db-sync" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.074174 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.076473 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vqvtf" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.077242 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.084466 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.148656 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8ltp\" (UniqueName: \"kubernetes.io/projected/a1ce9788-66bb-464a-8cb4-a28f43e4228f-kube-api-access-j8ltp\") pod \"nova-cell0-conductor-0\" (UID: \"a1ce9788-66bb-464a-8cb4-a28f43e4228f\") " pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:07 crc 
kubenswrapper[4867]: I1006 13:24:07.148741 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ce9788-66bb-464a-8cb4-a28f43e4228f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a1ce9788-66bb-464a-8cb4-a28f43e4228f\") " pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.148797 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ce9788-66bb-464a-8cb4-a28f43e4228f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a1ce9788-66bb-464a-8cb4-a28f43e4228f\") " pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.249851 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8ltp\" (UniqueName: \"kubernetes.io/projected/a1ce9788-66bb-464a-8cb4-a28f43e4228f-kube-api-access-j8ltp\") pod \"nova-cell0-conductor-0\" (UID: \"a1ce9788-66bb-464a-8cb4-a28f43e4228f\") " pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.249901 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ce9788-66bb-464a-8cb4-a28f43e4228f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a1ce9788-66bb-464a-8cb4-a28f43e4228f\") " pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.249940 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ce9788-66bb-464a-8cb4-a28f43e4228f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a1ce9788-66bb-464a-8cb4-a28f43e4228f\") " pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.254909 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ce9788-66bb-464a-8cb4-a28f43e4228f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a1ce9788-66bb-464a-8cb4-a28f43e4228f\") " pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.255273 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ce9788-66bb-464a-8cb4-a28f43e4228f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a1ce9788-66bb-464a-8cb4-a28f43e4228f\") " pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.271729 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8ltp\" (UniqueName: \"kubernetes.io/projected/a1ce9788-66bb-464a-8cb4-a28f43e4228f-kube-api-access-j8ltp\") pod \"nova-cell0-conductor-0\" (UID: \"a1ce9788-66bb-464a-8cb4-a28f43e4228f\") " pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.429615 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:07 crc kubenswrapper[4867]: I1006 13:24:07.915169 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 13:24:08 crc kubenswrapper[4867]: I1006 13:24:08.934537 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a1ce9788-66bb-464a-8cb4-a28f43e4228f","Type":"ContainerStarted","Data":"801b081ba713723b9b23242e5a88bc4cc7af436f6d98daaae4300c42f6d2fc63"} Oct 06 13:24:08 crc kubenswrapper[4867]: I1006 13:24:08.935095 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a1ce9788-66bb-464a-8cb4-a28f43e4228f","Type":"ContainerStarted","Data":"1c0b5615337d0cf7f99e44b5e31b1086b73dd607af606da68bf23a2d0448b146"} Oct 06 13:24:08 crc kubenswrapper[4867]: I1006 13:24:08.935182 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:08 crc kubenswrapper[4867]: I1006 13:24:08.955029 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.9550070339999999 podStartE2EDuration="1.955007034s" podCreationTimestamp="2025-10-06 13:24:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:24:08.949654248 +0000 UTC m=+1228.407602392" watchObservedRunningTime="2025-10-06 13:24:08.955007034 +0000 UTC m=+1228.412955178" Oct 06 13:24:12 crc kubenswrapper[4867]: I1006 13:24:12.873344 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:24:12 crc kubenswrapper[4867]: I1006 13:24:12.874113 
4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:24:17 crc kubenswrapper[4867]: I1006 13:24:17.460438 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.019888 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-6t4g8"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.021473 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.039776 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.039887 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.053160 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6t4g8"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.120039 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.120113 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swwgl\" (UniqueName: 
\"kubernetes.io/projected/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-kube-api-access-swwgl\") pod \"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.120209 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-scripts\") pod \"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.120264 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-config-data\") pod \"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.222422 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.229308 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swwgl\" (UniqueName: \"kubernetes.io/projected/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-kube-api-access-swwgl\") pod \"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.229521 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-scripts\") pod \"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.229592 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-config-data\") pod \"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.244837 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.247480 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-scripts\") pod \"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.257184 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-config-data\") pod \"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.277834 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swwgl\" (UniqueName: \"kubernetes.io/projected/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-kube-api-access-swwgl\") pod 
\"nova-cell0-cell-mapping-6t4g8\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.331091 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.339416 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.348857 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.349517 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.355882 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.403475 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.404779 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.413764 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.433792 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-logs\") pod \"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.433870 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw6rb\" (UniqueName: \"kubernetes.io/projected/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-kube-api-access-rw6rb\") pod \"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.433949 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.433978 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-config-data\") pod \"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.451501 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.535909 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnpmc\" (UniqueName: \"kubernetes.io/projected/74873575-9758-4c41-8cee-d7e71b25e9e3-kube-api-access-dnpmc\") pod \"nova-scheduler-0\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.535966 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-config-data\") pod \"nova-scheduler-0\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.536110 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-logs\") pod \"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.536168 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw6rb\" (UniqueName: \"kubernetes.io/projected/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-kube-api-access-rw6rb\") pod \"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.536218 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.536237 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-config-data\") pod 
\"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.536298 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.537117 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-logs\") pod \"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.544204 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.558639 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.560797 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-config-data\") pod \"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.560999 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.566963 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.576511 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw6rb\" (UniqueName: \"kubernetes.io/projected/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-kube-api-access-rw6rb\") pod \"nova-api-0\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.630035 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.638723 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvc4p\" (UniqueName: \"kubernetes.io/projected/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-kube-api-access-vvc4p\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") " pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.638781 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-config-data\") pod \"nova-scheduler-0\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.638844 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") " pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.638923 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-logs\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") " pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.638955 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-config-data\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") " pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.638990 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.639019 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnpmc\" (UniqueName: \"kubernetes.io/projected/74873575-9758-4c41-8cee-d7e71b25e9e3-kube-api-access-dnpmc\") pod \"nova-scheduler-0\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.678081 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.690163 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.725406 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-config-data\") pod \"nova-scheduler-0\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.744656 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.747220 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnpmc\" (UniqueName: \"kubernetes.io/projected/74873575-9758-4c41-8cee-d7e71b25e9e3-kube-api-access-dnpmc\") pod \"nova-scheduler-0\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.747830 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvc4p\" (UniqueName: \"kubernetes.io/projected/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-kube-api-access-vvc4p\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") " pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.751425 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") 
" pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.751604 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-logs\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") " pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.751674 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-config-data\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") " pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.752333 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-logs\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") " pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.765861 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") " pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.766310 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-config-data\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") " pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.766539 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.776992 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvc4p\" (UniqueName: \"kubernetes.io/projected/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-kube-api-access-vvc4p\") pod \"nova-metadata-0\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") " pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.784057 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.802796 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.844095 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.851683 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d8d7c5479-6cc94"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.855764 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjszs\" (UniqueName: \"kubernetes.io/projected/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-kube-api-access-vjszs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.855834 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.855872 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.859921 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.867415 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d8d7c5479-6cc94"] Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.958216 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.959712 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.959800 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s62b5\" (UniqueName: \"kubernetes.io/projected/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-kube-api-access-s62b5\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.959942 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-sb\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " 
pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.959982 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-config\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.960035 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-nb\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.960051 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-svc\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.960106 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjszs\" (UniqueName: \"kubernetes.io/projected/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-kube-api-access-vjszs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.960128 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-swift-storage-0\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " 
pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.960177 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.968519 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.969302 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:18 crc kubenswrapper[4867]: I1006 13:24:18.994338 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjszs\" (UniqueName: \"kubernetes.io/projected/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-kube-api-access-vjszs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.061538 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-sb\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.061591 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-config\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.061631 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-nb\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.061649 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-svc\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.061687 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-swift-storage-0\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.061748 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s62b5\" (UniqueName: \"kubernetes.io/projected/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-kube-api-access-s62b5\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.063202 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-sb\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.063712 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-config\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.064192 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-nb\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.065057 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-svc\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.071341 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-swift-storage-0\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.100425 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s62b5\" (UniqueName: 
\"kubernetes.io/projected/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-kube-api-access-s62b5\") pod \"dnsmasq-dns-d8d7c5479-6cc94\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.147283 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.190049 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6t4g8"] Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.214361 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.221350 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.527830 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-45s42"] Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.533175 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.536534 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.540321 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.542345 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-45s42"] Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.707208 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.707832 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-config-data\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.707879 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmsnx\" (UniqueName: \"kubernetes.io/projected/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-kube-api-access-xmsnx\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.708409 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-scripts\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.710107 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:24:19 crc kubenswrapper[4867]: W1006 13:24:19.716187 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74873575_9758_4c41_8cee_d7e71b25e9e3.slice/crio-b1ead17fe57880bc801fab2fd6f77757b64021ae64494d8e7496972be48a2504 WatchSource:0}: Error finding container b1ead17fe57880bc801fab2fd6f77757b64021ae64494d8e7496972be48a2504: Status 404 returned error can't find the container with id b1ead17fe57880bc801fab2fd6f77757b64021ae64494d8e7496972be48a2504 Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.810788 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-scripts\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.810951 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.810977 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-config-data\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.811019 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmsnx\" (UniqueName: \"kubernetes.io/projected/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-kube-api-access-xmsnx\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.820468 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.820837 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-scripts\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.821964 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-config-data\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.831331 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmsnx\" (UniqueName: 
\"kubernetes.io/projected/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-kube-api-access-xmsnx\") pod \"nova-cell1-conductor-db-sync-45s42\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.864940 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:19 crc kubenswrapper[4867]: I1006 13:24:19.921886 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:24:19 crc kubenswrapper[4867]: W1006 13:24:19.934678 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d464edf_53e5_436d_a4f3_a760e2f4e4aa.slice/crio-2be44df9025b6a050eb1c7c35ffd8abbefff7904d76d53143d97819cb6c9c92f WatchSource:0}: Error finding container 2be44df9025b6a050eb1c7c35ffd8abbefff7904d76d53143d97819cb6c9c92f: Status 404 returned error can't find the container with id 2be44df9025b6a050eb1c7c35ffd8abbefff7904d76d53143d97819cb6c9c92f Oct 06 13:24:20 crc kubenswrapper[4867]: I1006 13:24:20.007495 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 13:24:20 crc kubenswrapper[4867]: I1006 13:24:20.099054 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d8d7c5479-6cc94"] Oct 06 13:24:20 crc kubenswrapper[4867]: I1006 13:24:20.164906 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4","Type":"ContainerStarted","Data":"45886262c011a119ed5994e577c47b43fef3cd025c46015c59560e72b5998471"} Oct 06 13:24:20 crc kubenswrapper[4867]: I1006 13:24:20.166025 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"74873575-9758-4c41-8cee-d7e71b25e9e3","Type":"ContainerStarted","Data":"b1ead17fe57880bc801fab2fd6f77757b64021ae64494d8e7496972be48a2504"} Oct 06 13:24:20 crc kubenswrapper[4867]: I1006 13:24:20.167618 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d464edf-53e5-436d-a4f3-a760e2f4e4aa","Type":"ContainerStarted","Data":"2be44df9025b6a050eb1c7c35ffd8abbefff7904d76d53143d97819cb6c9c92f"} Oct 06 13:24:20 crc kubenswrapper[4867]: I1006 13:24:20.170622 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6t4g8" event={"ID":"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de","Type":"ContainerStarted","Data":"36e9525b08a2a448d75104b777e67afd6e6fbbd4a8f03f3f9412b4db00b869d0"} Oct 06 13:24:20 crc kubenswrapper[4867]: I1006 13:24:20.170673 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6t4g8" event={"ID":"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de","Type":"ContainerStarted","Data":"0993a59a9c6e5561644aca02a78772ad1f3e2ff9dae62cf666194725f00184e5"} Oct 06 13:24:20 crc kubenswrapper[4867]: I1006 13:24:20.172796 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30","Type":"ContainerStarted","Data":"77f9d5df6f74646c4f7fd1634e8b45dc7616f8bf1a170fd7ce666f21a7cbad85"} Oct 06 13:24:20 crc kubenswrapper[4867]: I1006 13:24:20.195979 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-6t4g8" podStartSLOduration=2.195958984 podStartE2EDuration="2.195958984s" podCreationTimestamp="2025-10-06 13:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:24:20.188728436 +0000 UTC m=+1239.646676600" watchObservedRunningTime="2025-10-06 13:24:20.195958984 +0000 UTC m=+1239.653907128" Oct 06 13:24:20 crc 
kubenswrapper[4867]: I1006 13:24:20.418858 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-45s42"] Oct 06 13:24:20 crc kubenswrapper[4867]: W1006 13:24:20.426339 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f6b405a_2a1a_4e35_b7e4_b5067e48fe18.slice/crio-c09330d322f3bcc50320b275237cf4ba73e82c31fdb4ec9e7011bb7697e1f81d WatchSource:0}: Error finding container c09330d322f3bcc50320b275237cf4ba73e82c31fdb4ec9e7011bb7697e1f81d: Status 404 returned error can't find the container with id c09330d322f3bcc50320b275237cf4ba73e82c31fdb4ec9e7011bb7697e1f81d Oct 06 13:24:21 crc kubenswrapper[4867]: I1006 13:24:21.192941 4867 generic.go:334] "Generic (PLEG): container finished" podID="1e0330c8-1ee8-44f5-9b86-7c2f242c1294" containerID="f2051c4d693a53f5b826a3414556155e042e4c81995d4ff872345a6846a0fe5c" exitCode=0 Oct 06 13:24:21 crc kubenswrapper[4867]: I1006 13:24:21.194081 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" event={"ID":"1e0330c8-1ee8-44f5-9b86-7c2f242c1294","Type":"ContainerDied","Data":"f2051c4d693a53f5b826a3414556155e042e4c81995d4ff872345a6846a0fe5c"} Oct 06 13:24:21 crc kubenswrapper[4867]: I1006 13:24:21.194128 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" event={"ID":"1e0330c8-1ee8-44f5-9b86-7c2f242c1294","Type":"ContainerStarted","Data":"675a8217476c48fa3500b0f2724406b982c7e6aa8adee5b9cba423f79b511c0b"} Oct 06 13:24:21 crc kubenswrapper[4867]: I1006 13:24:21.208868 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-45s42" event={"ID":"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18","Type":"ContainerStarted","Data":"9af65740392f896f693d7a00a116eee6637ce08de122e10313b26cc6d49e3251"} Oct 06 13:24:21 crc kubenswrapper[4867]: I1006 13:24:21.208958 4867 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell1-conductor-db-sync-45s42" event={"ID":"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18","Type":"ContainerStarted","Data":"c09330d322f3bcc50320b275237cf4ba73e82c31fdb4ec9e7011bb7697e1f81d"} Oct 06 13:24:21 crc kubenswrapper[4867]: I1006 13:24:21.442671 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-45s42" podStartSLOduration=2.442633612 podStartE2EDuration="2.442633612s" podCreationTimestamp="2025-10-06 13:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:24:21.396587483 +0000 UTC m=+1240.854535647" watchObservedRunningTime="2025-10-06 13:24:21.442633612 +0000 UTC m=+1240.900581746" Oct 06 13:24:22 crc kubenswrapper[4867]: I1006 13:24:22.381694 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 13:24:22 crc kubenswrapper[4867]: I1006 13:24:22.411847 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:24:23 crc kubenswrapper[4867]: I1006 13:24:23.415217 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.262806 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" event={"ID":"1e0330c8-1ee8-44f5-9b86-7c2f242c1294","Type":"ContainerStarted","Data":"dea6c6f18f0394e7d92ff0808615df1a3ea310e6a7b5b7a995fdd0a3d720d39e"} Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.263923 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.267422 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"9d464edf-53e5-436d-a4f3-a760e2f4e4aa","Type":"ContainerStarted","Data":"0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c"} Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.267459 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d464edf-53e5-436d-a4f3-a760e2f4e4aa","Type":"ContainerStarted","Data":"ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053"} Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.267475 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d464edf-53e5-436d-a4f3-a760e2f4e4aa" containerName="nova-metadata-log" containerID="cri-o://ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053" gracePeriod=30 Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.267547 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d464edf-53e5-436d-a4f3-a760e2f4e4aa" containerName="nova-metadata-metadata" containerID="cri-o://0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c" gracePeriod=30 Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.290354 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d3d871ba-46f8-465c-8bfe-2fb22ebc9c30" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae" gracePeriod=30 Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.290492 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30","Type":"ContainerStarted","Data":"6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae"} Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.299938 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4","Type":"ContainerStarted","Data":"276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02"} Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.300203 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4","Type":"ContainerStarted","Data":"16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70"} Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.303611 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"74873575-9758-4c41-8cee-d7e71b25e9e3","Type":"ContainerStarted","Data":"914ddb1b5019d0b3a62f70986515467a6bea3defd63215405ff225633654660f"} Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.321930 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.161982658 podStartE2EDuration="6.321909019s" podCreationTimestamp="2025-10-06 13:24:18 +0000 UTC" firstStartedPulling="2025-10-06 13:24:19.938647238 +0000 UTC m=+1239.396595382" lastFinishedPulling="2025-10-06 13:24:23.098573599 +0000 UTC m=+1242.556521743" observedRunningTime="2025-10-06 13:24:24.310645041 +0000 UTC m=+1243.768593195" watchObservedRunningTime="2025-10-06 13:24:24.321909019 +0000 UTC m=+1243.779857163" Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.323724 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" podStartSLOduration=6.323716478 podStartE2EDuration="6.323716478s" podCreationTimestamp="2025-10-06 13:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:24:24.290449619 +0000 UTC m=+1243.748397783" watchObservedRunningTime="2025-10-06 13:24:24.323716478 +0000 UTC m=+1243.781664622" Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.345998 
4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.251725812 podStartE2EDuration="6.345980197s" podCreationTimestamp="2025-10-06 13:24:18 +0000 UTC" firstStartedPulling="2025-10-06 13:24:20.016565049 +0000 UTC m=+1239.474513193" lastFinishedPulling="2025-10-06 13:24:23.110819434 +0000 UTC m=+1242.568767578" observedRunningTime="2025-10-06 13:24:24.331982264 +0000 UTC m=+1243.789930408" watchObservedRunningTime="2025-10-06 13:24:24.345980197 +0000 UTC m=+1243.803928331" Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.356496 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.979650452 podStartE2EDuration="6.356475204s" podCreationTimestamp="2025-10-06 13:24:18 +0000 UTC" firstStartedPulling="2025-10-06 13:24:19.721777128 +0000 UTC m=+1239.179725272" lastFinishedPulling="2025-10-06 13:24:23.09860188 +0000 UTC m=+1242.556550024" observedRunningTime="2025-10-06 13:24:24.353186204 +0000 UTC m=+1243.811134348" watchObservedRunningTime="2025-10-06 13:24:24.356475204 +0000 UTC m=+1243.814423348" Oct 06 13:24:24 crc kubenswrapper[4867]: I1006 13:24:24.378068 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.677284425 podStartE2EDuration="6.378043924s" podCreationTimestamp="2025-10-06 13:24:18 +0000 UTC" firstStartedPulling="2025-10-06 13:24:19.414673541 +0000 UTC m=+1238.872621675" lastFinishedPulling="2025-10-06 13:24:23.11543303 +0000 UTC m=+1242.573381174" observedRunningTime="2025-10-06 13:24:24.371901186 +0000 UTC m=+1243.829849330" watchObservedRunningTime="2025-10-06 13:24:24.378043924 +0000 UTC m=+1243.835992068" Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.068192 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.143644 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-logs\") pod \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") "
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.144090 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-config-data\") pod \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") "
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.144125 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-logs" (OuterVolumeSpecName: "logs") pod "9d464edf-53e5-436d-a4f3-a760e2f4e4aa" (UID: "9d464edf-53e5-436d-a4f3-a760e2f4e4aa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.144173 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvc4p\" (UniqueName: \"kubernetes.io/projected/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-kube-api-access-vvc4p\") pod \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") "
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.144279 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-combined-ca-bundle\") pod \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\" (UID: \"9d464edf-53e5-436d-a4f3-a760e2f4e4aa\") "
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.145630 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-logs\") on node \"crc\" DevicePath \"\""
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.180450 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-kube-api-access-vvc4p" (OuterVolumeSpecName: "kube-api-access-vvc4p") pod "9d464edf-53e5-436d-a4f3-a760e2f4e4aa" (UID: "9d464edf-53e5-436d-a4f3-a760e2f4e4aa"). InnerVolumeSpecName "kube-api-access-vvc4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.180711 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-config-data" (OuterVolumeSpecName: "config-data") pod "9d464edf-53e5-436d-a4f3-a760e2f4e4aa" (UID: "9d464edf-53e5-436d-a4f3-a760e2f4e4aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.185010 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d464edf-53e5-436d-a4f3-a760e2f4e4aa" (UID: "9d464edf-53e5-436d-a4f3-a760e2f4e4aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.248618 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.248665 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvc4p\" (UniqueName: \"kubernetes.io/projected/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-kube-api-access-vvc4p\") on node \"crc\" DevicePath \"\""
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.248681 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d464edf-53e5-436d-a4f3-a760e2f4e4aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.319008 4867 generic.go:334] "Generic (PLEG): container finished" podID="9d464edf-53e5-436d-a4f3-a760e2f4e4aa" containerID="0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c" exitCode=0
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.319049 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.319084 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d464edf-53e5-436d-a4f3-a760e2f4e4aa","Type":"ContainerDied","Data":"0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c"}
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.319141 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d464edf-53e5-436d-a4f3-a760e2f4e4aa","Type":"ContainerDied","Data":"ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053"}
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.319166 4867 scope.go:117] "RemoveContainer" containerID="0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.319454 4867 generic.go:334] "Generic (PLEG): container finished" podID="9d464edf-53e5-436d-a4f3-a760e2f4e4aa" containerID="ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053" exitCode=143
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.319634 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d464edf-53e5-436d-a4f3-a760e2f4e4aa","Type":"ContainerDied","Data":"2be44df9025b6a050eb1c7c35ffd8abbefff7904d76d53143d97819cb6c9c92f"}
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.353132 4867 scope.go:117] "RemoveContainer" containerID="ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.353280 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.376194 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.381695 4867 scope.go:117] "RemoveContainer" containerID="0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c"
Oct 06 13:24:25 crc kubenswrapper[4867]: E1006 13:24:25.382361 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c\": container with ID starting with 0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c not found: ID does not exist" containerID="0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.382427 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c"} err="failed to get container status \"0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c\": rpc error: code = NotFound desc = could not find container \"0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c\": container with ID starting with 0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c not found: ID does not exist"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.382464 4867 scope.go:117] "RemoveContainer" containerID="ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053"
Oct 06 13:24:25 crc kubenswrapper[4867]: E1006 13:24:25.382952 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053\": container with ID starting with ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053 not found: ID does not exist" containerID="ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.382998 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053"} err="failed to get container status \"ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053\": rpc error: code = NotFound desc = could not find container \"ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053\": container with ID starting with ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053 not found: ID does not exist"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.383028 4867 scope.go:117] "RemoveContainer" containerID="0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.383289 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c"} err="failed to get container status \"0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c\": rpc error: code = NotFound desc = could not find container \"0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c\": container with ID starting with 0aefc251161e1ee00dca5a1900dd121f93371b45cc1af139f07be82380bfb33c not found: ID does not exist"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.383313 4867 scope.go:117] "RemoveContainer" containerID="ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.384229 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053"} err="failed to get container status \"ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053\": rpc error: code = NotFound desc = could not find container \"ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053\": container with ID starting with ba385498c39a02a9ca06eb4ac1e81049807019efd8b7327c9ab4714c04441053 not found: ID does not exist"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.390609 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 13:24:25 crc kubenswrapper[4867]: E1006 13:24:25.391312 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d464edf-53e5-436d-a4f3-a760e2f4e4aa" containerName="nova-metadata-metadata"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.391338 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d464edf-53e5-436d-a4f3-a760e2f4e4aa" containerName="nova-metadata-metadata"
Oct 06 13:24:25 crc kubenswrapper[4867]: E1006 13:24:25.391357 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d464edf-53e5-436d-a4f3-a760e2f4e4aa" containerName="nova-metadata-log"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.391365 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d464edf-53e5-436d-a4f3-a760e2f4e4aa" containerName="nova-metadata-log"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.391625 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d464edf-53e5-436d-a4f3-a760e2f4e4aa" containerName="nova-metadata-log"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.391655 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d464edf-53e5-436d-a4f3-a760e2f4e4aa" containerName="nova-metadata-metadata"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.393058 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.401167 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.402407 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.408548 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.456019 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-config-data\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.456164 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2mjv\" (UniqueName: \"kubernetes.io/projected/338211e8-4b3f-458f-a37c-812c46dca850-kube-api-access-z2mjv\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.456350 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/338211e8-4b3f-458f-a37c-812c46dca850-logs\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.456397 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.456516 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.558974 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.559278 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-config-data\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.559456 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2mjv\" (UniqueName: \"kubernetes.io/projected/338211e8-4b3f-458f-a37c-812c46dca850-kube-api-access-z2mjv\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.559909 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/338211e8-4b3f-458f-a37c-812c46dca850-logs\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.560320 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.560434 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/338211e8-4b3f-458f-a37c-812c46dca850-logs\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.577099 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-config-data\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.578234 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.579886 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.583587 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2mjv\" (UniqueName: \"kubernetes.io/projected/338211e8-4b3f-458f-a37c-812c46dca850-kube-api-access-z2mjv\") pod \"nova-metadata-0\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " pod="openstack/nova-metadata-0"
Oct 06 13:24:25 crc kubenswrapper[4867]: I1006 13:24:25.695389 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 13:24:26 crc kubenswrapper[4867]: I1006 13:24:26.281957 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 13:24:26 crc kubenswrapper[4867]: W1006 13:24:26.291296 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod338211e8_4b3f_458f_a37c_812c46dca850.slice/crio-d6c5a6ce6bfb171c25c5a6ba47e7a75cb5597ff41ba6e4d01162b2cbf8f8afa3 WatchSource:0}: Error finding container d6c5a6ce6bfb171c25c5a6ba47e7a75cb5597ff41ba6e4d01162b2cbf8f8afa3: Status 404 returned error can't find the container with id d6c5a6ce6bfb171c25c5a6ba47e7a75cb5597ff41ba6e4d01162b2cbf8f8afa3
Oct 06 13:24:26 crc kubenswrapper[4867]: I1006 13:24:26.348826 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"338211e8-4b3f-458f-a37c-812c46dca850","Type":"ContainerStarted","Data":"d6c5a6ce6bfb171c25c5a6ba47e7a75cb5597ff41ba6e4d01162b2cbf8f8afa3"}
Oct 06 13:24:27 crc kubenswrapper[4867]: I1006 13:24:27.243808 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d464edf-53e5-436d-a4f3-a760e2f4e4aa" path="/var/lib/kubelet/pods/9d464edf-53e5-436d-a4f3-a760e2f4e4aa/volumes"
Oct 06 13:24:27 crc kubenswrapper[4867]: I1006 13:24:27.370899 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"338211e8-4b3f-458f-a37c-812c46dca850","Type":"ContainerStarted","Data":"47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094"}
Oct 06 13:24:27 crc kubenswrapper[4867]: I1006 13:24:27.370959 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"338211e8-4b3f-458f-a37c-812c46dca850","Type":"ContainerStarted","Data":"7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5"}
Oct 06 13:24:27 crc kubenswrapper[4867]: I1006 13:24:27.401036 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.4010195899999998 podStartE2EDuration="2.40101959s" podCreationTimestamp="2025-10-06 13:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:24:27.39990943 +0000 UTC m=+1246.857857584" watchObservedRunningTime="2025-10-06 13:24:27.40101959 +0000 UTC m=+1246.858967734"
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.011223 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.011737 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="469e79f5-1d34-4151-ae0b-81301742c10c" containerName="kube-state-metrics" containerID="cri-o://718eb30cf43a0ebe2a9c1cade0cd6ce4f16a1a0abf514b09678ed88ccfe2febe" gracePeriod=30
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.381106 4867 generic.go:334] "Generic (PLEG): container finished" podID="469e79f5-1d34-4151-ae0b-81301742c10c" containerID="718eb30cf43a0ebe2a9c1cade0cd6ce4f16a1a0abf514b09678ed88ccfe2febe" exitCode=2
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.381194 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"469e79f5-1d34-4151-ae0b-81301742c10c","Type":"ContainerDied","Data":"718eb30cf43a0ebe2a9c1cade0cd6ce4f16a1a0abf514b09678ed88ccfe2febe"}
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.523243 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.653940 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4j7j\" (UniqueName: \"kubernetes.io/projected/469e79f5-1d34-4151-ae0b-81301742c10c-kube-api-access-t4j7j\") pod \"469e79f5-1d34-4151-ae0b-81301742c10c\" (UID: \"469e79f5-1d34-4151-ae0b-81301742c10c\") "
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.660013 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469e79f5-1d34-4151-ae0b-81301742c10c-kube-api-access-t4j7j" (OuterVolumeSpecName: "kube-api-access-t4j7j") pod "469e79f5-1d34-4151-ae0b-81301742c10c" (UID: "469e79f5-1d34-4151-ae0b-81301742c10c"). InnerVolumeSpecName "kube-api-access-t4j7j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.679970 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.691601 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.756625 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4j7j\" (UniqueName: \"kubernetes.io/projected/469e79f5-1d34-4151-ae0b-81301742c10c-kube-api-access-t4j7j\") on node \"crc\" DevicePath \"\""
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.845532 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.845576 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 06 13:24:28 crc kubenswrapper[4867]: I1006 13:24:28.877964 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.148221 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.216974 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.285523 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76747ff567-27q8x"]
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.286339 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76747ff567-27q8x" podUID="63a14623-32a3-4753-8626-4ffba880aced" containerName="dnsmasq-dns" containerID="cri-o://9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f" gracePeriod=10
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.404743 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"469e79f5-1d34-4151-ae0b-81301742c10c","Type":"ContainerDied","Data":"0a1c38f4628d485fa378ca9cdfea567c0ab3ac7fa6c60d75b5b9e35db5cae979"}
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.404820 4867 scope.go:117] "RemoveContainer" containerID="718eb30cf43a0ebe2a9c1cade0cd6ce4f16a1a0abf514b09678ed88ccfe2febe"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.405034 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.413523 4867 generic.go:334] "Generic (PLEG): container finished" podID="6e0f139d-a521-4cc0-95cd-76e5f7f4d8de" containerID="36e9525b08a2a448d75104b777e67afd6e6fbbd4a8f03f3f9412b4db00b869d0" exitCode=0
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.413577 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6t4g8" event={"ID":"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de","Type":"ContainerDied","Data":"36e9525b08a2a448d75104b777e67afd6e6fbbd4a8f03f3f9412b4db00b869d0"}
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.448976 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.553553 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.561394 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.594765 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 13:24:29 crc kubenswrapper[4867]: E1006 13:24:29.595363 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="469e79f5-1d34-4151-ae0b-81301742c10c" containerName="kube-state-metrics"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.595388 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="469e79f5-1d34-4151-ae0b-81301742c10c" containerName="kube-state-metrics"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.595683 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="469e79f5-1d34-4151-ae0b-81301742c10c" containerName="kube-state-metrics"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.596649 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.599278 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.599306 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.605186 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.673582 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/feb72ccb-56bd-433d-b82c-6002fed1e09d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.673733 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb72ccb-56bd-433d-b82c-6002fed1e09d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.673803 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb72ccb-56bd-433d-b82c-6002fed1e09d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.673832 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr54m\" (UniqueName: \"kubernetes.io/projected/feb72ccb-56bd-433d-b82c-6002fed1e09d-kube-api-access-dr54m\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.721815 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.762487 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.775739 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb72ccb-56bd-433d-b82c-6002fed1e09d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.775831 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb72ccb-56bd-433d-b82c-6002fed1e09d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.775859 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr54m\" (UniqueName: \"kubernetes.io/projected/feb72ccb-56bd-433d-b82c-6002fed1e09d-kube-api-access-dr54m\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.775910 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/feb72ccb-56bd-433d-b82c-6002fed1e09d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.784293 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/feb72ccb-56bd-433d-b82c-6002fed1e09d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.790883 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb72ccb-56bd-433d-b82c-6002fed1e09d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.794433 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb72ccb-56bd-433d-b82c-6002fed1e09d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.800572 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr54m\" (UniqueName: \"kubernetes.io/projected/feb72ccb-56bd-433d-b82c-6002fed1e09d-kube-api-access-dr54m\") pod \"kube-state-metrics-0\" (UID: \"feb72ccb-56bd-433d-b82c-6002fed1e09d\") " pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.897155 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76747ff567-27q8x"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.916807 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.993525 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkxjn\" (UniqueName: \"kubernetes.io/projected/63a14623-32a3-4753-8626-4ffba880aced-kube-api-access-pkxjn\") pod \"63a14623-32a3-4753-8626-4ffba880aced\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") "
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.993688 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-sb\") pod \"63a14623-32a3-4753-8626-4ffba880aced\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") "
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.993756 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-config\") pod \"63a14623-32a3-4753-8626-4ffba880aced\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") "
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.993790 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-nb\") pod \"63a14623-32a3-4753-8626-4ffba880aced\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") "
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.993930 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-svc\") pod \"63a14623-32a3-4753-8626-4ffba880aced\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") "
Oct 06 13:24:29 crc kubenswrapper[4867]: I1006 13:24:29.994005 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-swift-storage-0\") pod \"63a14623-32a3-4753-8626-4ffba880aced\" (UID: \"63a14623-32a3-4753-8626-4ffba880aced\") "
Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.026464 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63a14623-32a3-4753-8626-4ffba880aced-kube-api-access-pkxjn" (OuterVolumeSpecName: "kube-api-access-pkxjn") pod "63a14623-32a3-4753-8626-4ffba880aced" (UID: "63a14623-32a3-4753-8626-4ffba880aced"). InnerVolumeSpecName "kube-api-access-pkxjn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.081893 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "63a14623-32a3-4753-8626-4ffba880aced" (UID: "63a14623-32a3-4753-8626-4ffba880aced"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.098494 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.098528 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkxjn\" (UniqueName: \"kubernetes.io/projected/63a14623-32a3-4753-8626-4ffba880aced-kube-api-access-pkxjn\") on node \"crc\" DevicePath \"\""
Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.117443 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "63a14623-32a3-4753-8626-4ffba880aced" (UID: "63a14623-32a3-4753-8626-4ffba880aced"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.152973 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-config" (OuterVolumeSpecName: "config") pod "63a14623-32a3-4753-8626-4ffba880aced" (UID: "63a14623-32a3-4753-8626-4ffba880aced"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.188094 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "63a14623-32a3-4753-8626-4ffba880aced" (UID: "63a14623-32a3-4753-8626-4ffba880aced"). InnerVolumeSpecName "ovsdbserver-nb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.200930 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.201325 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.201365 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.226466 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "63a14623-32a3-4753-8626-4ffba880aced" (UID: "63a14623-32a3-4753-8626-4ffba880aced"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.302663 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63a14623-32a3-4753-8626-4ffba880aced-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.426045 4867 generic.go:334] "Generic (PLEG): container finished" podID="63a14623-32a3-4753-8626-4ffba880aced" containerID="9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f" exitCode=0 Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.426119 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76747ff567-27q8x" event={"ID":"63a14623-32a3-4753-8626-4ffba880aced","Type":"ContainerDied","Data":"9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f"} Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.426148 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76747ff567-27q8x" event={"ID":"63a14623-32a3-4753-8626-4ffba880aced","Type":"ContainerDied","Data":"8e9970428f72926ced79545fe2d852d7074790cafe643bab7d940652abfcf1ac"} Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.426167 4867 scope.go:117] "RemoveContainer" containerID="9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.426184 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76747ff567-27q8x" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.486939 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76747ff567-27q8x"] Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.495187 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76747ff567-27q8x"] Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.500385 4867 scope.go:117] "RemoveContainer" containerID="8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.540100 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.572222 4867 scope.go:117] "RemoveContainer" containerID="9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f" Oct 06 13:24:30 crc kubenswrapper[4867]: E1006 13:24:30.572913 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f\": container with ID starting with 9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f not found: ID does not exist" containerID="9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.572990 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f"} err="failed to get container status \"9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f\": rpc error: code = NotFound desc = could not find container \"9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f\": container with ID starting with 9efff486e50a281ec4e3c3b9964b58b5eee9274b85abc83bcdb27f19de00bc7f not found: ID does not exist" Oct 06 13:24:30 crc 
kubenswrapper[4867]: I1006 13:24:30.573035 4867 scope.go:117] "RemoveContainer" containerID="8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c" Oct 06 13:24:30 crc kubenswrapper[4867]: E1006 13:24:30.573596 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c\": container with ID starting with 8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c not found: ID does not exist" containerID="8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.573641 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c"} err="failed to get container status \"8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c\": rpc error: code = NotFound desc = could not find container \"8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c\": container with ID starting with 8a42f85cba4fe735d646e5460506b5a24e8d45df1f8ece06197a04fca2a9377c not found: ID does not exist" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.696372 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.697493 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.700979 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.701342 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="ceilometer-central-agent" 
containerID="cri-o://c57edc48609e65fde5369175114434967c4882943e1979b777657bda75fe0bdb" gracePeriod=30 Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.701491 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="sg-core" containerID="cri-o://1ed25887d16f76930f9b625495510144e00929118f3d31c30806f582ffd83626" gracePeriod=30 Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.701524 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="proxy-httpd" containerID="cri-o://cdfe0ab8b58aa25595ad138ac85533d85ff017b824419a1c68e14ab1dc9bbf02" gracePeriod=30 Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.701970 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="ceilometer-notification-agent" containerID="cri-o://e0a580e6a0daebb9815f50e20686e3bb7251077cab2afa1f34f86b0aa99ae106" gracePeriod=30 Oct 06 13:24:30 crc kubenswrapper[4867]: I1006 13:24:30.916626 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.017677 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-config-data\") pod \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.018108 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-combined-ca-bundle\") pod \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.018211 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swwgl\" (UniqueName: \"kubernetes.io/projected/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-kube-api-access-swwgl\") pod \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.018398 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-scripts\") pod \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\" (UID: \"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de\") " Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.024417 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-scripts" (OuterVolumeSpecName: "scripts") pod "6e0f139d-a521-4cc0-95cd-76e5f7f4d8de" (UID: "6e0f139d-a521-4cc0-95cd-76e5f7f4d8de"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.024609 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-kube-api-access-swwgl" (OuterVolumeSpecName: "kube-api-access-swwgl") pod "6e0f139d-a521-4cc0-95cd-76e5f7f4d8de" (UID: "6e0f139d-a521-4cc0-95cd-76e5f7f4d8de"). InnerVolumeSpecName "kube-api-access-swwgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.056760 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e0f139d-a521-4cc0-95cd-76e5f7f4d8de" (UID: "6e0f139d-a521-4cc0-95cd-76e5f7f4d8de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.062414 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-config-data" (OuterVolumeSpecName: "config-data") pod "6e0f139d-a521-4cc0-95cd-76e5f7f4d8de" (UID: "6e0f139d-a521-4cc0-95cd-76e5f7f4d8de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.120877 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.120928 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.120939 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swwgl\" (UniqueName: \"kubernetes.io/projected/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-kube-api-access-swwgl\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.120949 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.247498 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="469e79f5-1d34-4151-ae0b-81301742c10c" path="/var/lib/kubelet/pods/469e79f5-1d34-4151-ae0b-81301742c10c/volumes" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.248309 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63a14623-32a3-4753-8626-4ffba880aced" path="/var/lib/kubelet/pods/63a14623-32a3-4753-8626-4ffba880aced/volumes" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.470978 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6t4g8" event={"ID":"6e0f139d-a521-4cc0-95cd-76e5f7f4d8de","Type":"ContainerDied","Data":"0993a59a9c6e5561644aca02a78772ad1f3e2ff9dae62cf666194725f00184e5"} Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.471368 4867 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0993a59a9c6e5561644aca02a78772ad1f3e2ff9dae62cf666194725f00184e5" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.471443 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6t4g8" Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.476678 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f6b405a-2a1a-4e35-b7e4-b5067e48fe18" containerID="9af65740392f896f693d7a00a116eee6637ce08de122e10313b26cc6d49e3251" exitCode=0 Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.476752 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-45s42" event={"ID":"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18","Type":"ContainerDied","Data":"9af65740392f896f693d7a00a116eee6637ce08de122e10313b26cc6d49e3251"} Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.477852 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"feb72ccb-56bd-433d-b82c-6002fed1e09d","Type":"ContainerStarted","Data":"5f07eb9ed80bef5466b4746daf187fbd5723b48f2d7754aa240d2a87761e5312"} Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.484423 4867 generic.go:334] "Generic (PLEG): container finished" podID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerID="cdfe0ab8b58aa25595ad138ac85533d85ff017b824419a1c68e14ab1dc9bbf02" exitCode=0 Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.484458 4867 generic.go:334] "Generic (PLEG): container finished" podID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerID="1ed25887d16f76930f9b625495510144e00929118f3d31c30806f582ffd83626" exitCode=2 Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.484468 4867 generic.go:334] "Generic (PLEG): container finished" podID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerID="c57edc48609e65fde5369175114434967c4882943e1979b777657bda75fe0bdb" exitCode=0 Oct 06 
13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.485338 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64fb7edb-92e9-4b44-b05a-667d470f7b10","Type":"ContainerDied","Data":"cdfe0ab8b58aa25595ad138ac85533d85ff017b824419a1c68e14ab1dc9bbf02"} Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.485367 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64fb7edb-92e9-4b44-b05a-667d470f7b10","Type":"ContainerDied","Data":"1ed25887d16f76930f9b625495510144e00929118f3d31c30806f582ffd83626"} Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.485379 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64fb7edb-92e9-4b44-b05a-667d470f7b10","Type":"ContainerDied","Data":"c57edc48609e65fde5369175114434967c4882943e1979b777657bda75fe0bdb"} Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.635578 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.635822 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerName="nova-api-log" containerID="cri-o://16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70" gracePeriod=30 Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.635959 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerName="nova-api-api" containerID="cri-o://276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02" gracePeriod=30 Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.726218 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:24:31 crc kubenswrapper[4867]: I1006 13:24:31.743294 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Oct 06 13:24:32 crc kubenswrapper[4867]: I1006 13:24:32.495970 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"feb72ccb-56bd-433d-b82c-6002fed1e09d","Type":"ContainerStarted","Data":"13926db1a311866137c8f960bfd1ea44ad15ffcc14f574070eb7d8f38674228d"} Oct 06 13:24:32 crc kubenswrapper[4867]: I1006 13:24:32.496337 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 06 13:24:32 crc kubenswrapper[4867]: I1006 13:24:32.498762 4867 generic.go:334] "Generic (PLEG): container finished" podID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerID="16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70" exitCode=143 Oct 06 13:24:32 crc kubenswrapper[4867]: I1006 13:24:32.498871 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4","Type":"ContainerDied","Data":"16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70"} Oct 06 13:24:32 crc kubenswrapper[4867]: I1006 13:24:32.498976 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="338211e8-4b3f-458f-a37c-812c46dca850" containerName="nova-metadata-log" containerID="cri-o://7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5" gracePeriod=30 Oct 06 13:24:32 crc kubenswrapper[4867]: I1006 13:24:32.499056 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="338211e8-4b3f-458f-a37c-812c46dca850" containerName="nova-metadata-metadata" containerID="cri-o://47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094" gracePeriod=30 Oct 06 13:24:32 crc kubenswrapper[4867]: I1006 13:24:32.499088 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="74873575-9758-4c41-8cee-d7e71b25e9e3" 
containerName="nova-scheduler-scheduler" containerID="cri-o://914ddb1b5019d0b3a62f70986515467a6bea3defd63215405ff225633654660f" gracePeriod=30 Oct 06 13:24:32 crc kubenswrapper[4867]: I1006 13:24:32.526956 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.217298487 podStartE2EDuration="3.526925387s" podCreationTimestamp="2025-10-06 13:24:29 +0000 UTC" firstStartedPulling="2025-10-06 13:24:30.582537171 +0000 UTC m=+1250.040485315" lastFinishedPulling="2025-10-06 13:24:31.892164071 +0000 UTC m=+1251.350112215" observedRunningTime="2025-10-06 13:24:32.513989663 +0000 UTC m=+1251.971937827" watchObservedRunningTime="2025-10-06 13:24:32.526925387 +0000 UTC m=+1251.984873531" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.011707 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.109099 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-scripts\") pod \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.109225 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-combined-ca-bundle\") pod \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.109377 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-config-data\") pod \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " 
Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.109602 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmsnx\" (UniqueName: \"kubernetes.io/projected/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-kube-api-access-xmsnx\") pod \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\" (UID: \"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18\") " Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.125356 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-kube-api-access-xmsnx" (OuterVolumeSpecName: "kube-api-access-xmsnx") pod "8f6b405a-2a1a-4e35-b7e4-b5067e48fe18" (UID: "8f6b405a-2a1a-4e35-b7e4-b5067e48fe18"). InnerVolumeSpecName "kube-api-access-xmsnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.156346 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-scripts" (OuterVolumeSpecName: "scripts") pod "8f6b405a-2a1a-4e35-b7e4-b5067e48fe18" (UID: "8f6b405a-2a1a-4e35-b7e4-b5067e48fe18"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.216299 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmsnx\" (UniqueName: \"kubernetes.io/projected/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-kube-api-access-xmsnx\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.216343 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.236847 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-config-data" (OuterVolumeSpecName: "config-data") pod "8f6b405a-2a1a-4e35-b7e4-b5067e48fe18" (UID: "8f6b405a-2a1a-4e35-b7e4-b5067e48fe18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.247054 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f6b405a-2a1a-4e35-b7e4-b5067e48fe18" (UID: "8f6b405a-2a1a-4e35-b7e4-b5067e48fe18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.261157 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.317041 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-combined-ca-bundle\") pod \"338211e8-4b3f-458f-a37c-812c46dca850\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.317122 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-config-data\") pod \"338211e8-4b3f-458f-a37c-812c46dca850\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.317304 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2mjv\" (UniqueName: \"kubernetes.io/projected/338211e8-4b3f-458f-a37c-812c46dca850-kube-api-access-z2mjv\") pod \"338211e8-4b3f-458f-a37c-812c46dca850\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.317389 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/338211e8-4b3f-458f-a37c-812c46dca850-logs\") pod \"338211e8-4b3f-458f-a37c-812c46dca850\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.317427 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-nova-metadata-tls-certs\") pod \"338211e8-4b3f-458f-a37c-812c46dca850\" (UID: \"338211e8-4b3f-458f-a37c-812c46dca850\") " Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.317805 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.317824 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.320455 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/338211e8-4b3f-458f-a37c-812c46dca850-logs" (OuterVolumeSpecName: "logs") pod "338211e8-4b3f-458f-a37c-812c46dca850" (UID: "338211e8-4b3f-458f-a37c-812c46dca850"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.324227 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/338211e8-4b3f-458f-a37c-812c46dca850-kube-api-access-z2mjv" (OuterVolumeSpecName: "kube-api-access-z2mjv") pod "338211e8-4b3f-458f-a37c-812c46dca850" (UID: "338211e8-4b3f-458f-a37c-812c46dca850"). InnerVolumeSpecName "kube-api-access-z2mjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.359327 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "338211e8-4b3f-458f-a37c-812c46dca850" (UID: "338211e8-4b3f-458f-a37c-812c46dca850"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.365872 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-config-data" (OuterVolumeSpecName: "config-data") pod "338211e8-4b3f-458f-a37c-812c46dca850" (UID: "338211e8-4b3f-458f-a37c-812c46dca850"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.389778 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "338211e8-4b3f-458f-a37c-812c46dca850" (UID: "338211e8-4b3f-458f-a37c-812c46dca850"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.420621 4867 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.420675 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.420686 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/338211e8-4b3f-458f-a37c-812c46dca850-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.420696 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2mjv\" (UniqueName: 
\"kubernetes.io/projected/338211e8-4b3f-458f-a37c-812c46dca850-kube-api-access-z2mjv\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.420705 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/338211e8-4b3f-458f-a37c-812c46dca850-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.533600 4867 generic.go:334] "Generic (PLEG): container finished" podID="338211e8-4b3f-458f-a37c-812c46dca850" containerID="47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094" exitCode=0 Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.533640 4867 generic.go:334] "Generic (PLEG): container finished" podID="338211e8-4b3f-458f-a37c-812c46dca850" containerID="7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5" exitCode=143 Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.533700 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"338211e8-4b3f-458f-a37c-812c46dca850","Type":"ContainerDied","Data":"47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094"} Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.533742 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"338211e8-4b3f-458f-a37c-812c46dca850","Type":"ContainerDied","Data":"7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5"} Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.533754 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"338211e8-4b3f-458f-a37c-812c46dca850","Type":"ContainerDied","Data":"d6c5a6ce6bfb171c25c5a6ba47e7a75cb5597ff41ba6e4d01162b2cbf8f8afa3"} Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.533769 4867 scope.go:117] "RemoveContainer" containerID="47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094" Oct 06 13:24:33 crc kubenswrapper[4867]: 
I1006 13:24:33.535290 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.536033 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-45s42" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.536015 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-45s42" event={"ID":"8f6b405a-2a1a-4e35-b7e4-b5067e48fe18","Type":"ContainerDied","Data":"c09330d322f3bcc50320b275237cf4ba73e82c31fdb4ec9e7011bb7697e1f81d"} Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.536078 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c09330d322f3bcc50320b275237cf4ba73e82c31fdb4ec9e7011bb7697e1f81d" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.588270 4867 scope.go:117] "RemoveContainer" containerID="7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.603834 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.619978 4867 scope.go:117] "RemoveContainer" containerID="47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094" Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.620804 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094\": container with ID starting with 47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094 not found: ID does not exist" containerID="47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.620835 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094"} err="failed to get container status \"47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094\": rpc error: code = NotFound desc = could not find container \"47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094\": container with ID starting with 47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094 not found: ID does not exist" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.620858 4867 scope.go:117] "RemoveContainer" containerID="7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5" Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.623392 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5\": container with ID starting with 7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5 not found: ID does not exist" containerID="7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.623417 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5"} err="failed to get container status \"7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5\": rpc error: code = NotFound desc = could not find container \"7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5\": container with ID starting with 7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5 not found: ID does not exist" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.623434 4867 scope.go:117] "RemoveContainer" containerID="47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.625977 4867 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094"} err="failed to get container status \"47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094\": rpc error: code = NotFound desc = could not find container \"47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094\": container with ID starting with 47ed64e02e5b5792796048f22bdc84acfe310d520ff3856b49595fd8ae708094 not found: ID does not exist" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.626004 4867 scope.go:117] "RemoveContainer" containerID="7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.626093 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.627989 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5"} err="failed to get container status \"7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5\": rpc error: code = NotFound desc = could not find container \"7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5\": container with ID starting with 7f7cff9207019d6fabf05c94929544de39913afcdf01058b632b7eac29d73cb5 not found: ID does not exist" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.656753 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.660138 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a14623-32a3-4753-8626-4ffba880aced" containerName="init" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.660163 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a14623-32a3-4753-8626-4ffba880aced" containerName="init" Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.660182 4867 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a14623-32a3-4753-8626-4ffba880aced" containerName="dnsmasq-dns" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.660190 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a14623-32a3-4753-8626-4ffba880aced" containerName="dnsmasq-dns" Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.660232 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6b405a-2a1a-4e35-b7e4-b5067e48fe18" containerName="nova-cell1-conductor-db-sync" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.660241 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6b405a-2a1a-4e35-b7e4-b5067e48fe18" containerName="nova-cell1-conductor-db-sync" Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.660272 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="338211e8-4b3f-458f-a37c-812c46dca850" containerName="nova-metadata-log" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.660279 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="338211e8-4b3f-458f-a37c-812c46dca850" containerName="nova-metadata-log" Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.660295 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0f139d-a521-4cc0-95cd-76e5f7f4d8de" containerName="nova-manage" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.660301 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0f139d-a521-4cc0-95cd-76e5f7f4d8de" containerName="nova-manage" Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.660310 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="338211e8-4b3f-458f-a37c-812c46dca850" containerName="nova-metadata-metadata" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.660316 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="338211e8-4b3f-458f-a37c-812c46dca850" containerName="nova-metadata-metadata" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 
13:24:33.660536 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="338211e8-4b3f-458f-a37c-812c46dca850" containerName="nova-metadata-metadata" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.660554 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="63a14623-32a3-4753-8626-4ffba880aced" containerName="dnsmasq-dns" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.660568 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e0f139d-a521-4cc0-95cd-76e5f7f4d8de" containerName="nova-manage" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.660581 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6b405a-2a1a-4e35-b7e4-b5067e48fe18" containerName="nova-cell1-conductor-db-sync" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.660590 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="338211e8-4b3f-458f-a37c-812c46dca850" containerName="nova-metadata-log" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.668641 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.671955 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.672150 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.681911 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.683512 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.687284 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.693718 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.709783 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.727232 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.727592 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac98dfd9-17f4-4911-83b7-ae865a97d33c-logs\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.727721 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-config-data\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.727807 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.727940 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh646\" (UniqueName: \"kubernetes.io/projected/ac98dfd9-17f4-4911-83b7-ae865a97d33c-kube-api-access-nh646\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.830996 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh646\" (UniqueName: \"kubernetes.io/projected/ac98dfd9-17f4-4911-83b7-ae865a97d33c-kube-api-access-nh646\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.831189 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3668fba3-af0f-478b-a41b-5de304592f65-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3668fba3-af0f-478b-a41b-5de304592f65\") " pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.831272 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.831295 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac98dfd9-17f4-4911-83b7-ae865a97d33c-logs\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 
13:24:33.831318 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3668fba3-af0f-478b-a41b-5de304592f65-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3668fba3-af0f-478b-a41b-5de304592f65\") " pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.831369 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4n2l\" (UniqueName: \"kubernetes.io/projected/3668fba3-af0f-478b-a41b-5de304592f65-kube-api-access-l4n2l\") pod \"nova-cell1-conductor-0\" (UID: \"3668fba3-af0f-478b-a41b-5de304592f65\") " pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.831410 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-config-data\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.831447 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.832187 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac98dfd9-17f4-4911-83b7-ae865a97d33c-logs\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.837964 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.838185 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-config-data\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.844102 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.850416 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh646\" (UniqueName: \"kubernetes.io/projected/ac98dfd9-17f4-4911-83b7-ae865a97d33c-kube-api-access-nh646\") pod \"nova-metadata-0\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " pod="openstack/nova-metadata-0" Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.850525 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="914ddb1b5019d0b3a62f70986515467a6bea3defd63215405ff225633654660f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.854670 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="914ddb1b5019d0b3a62f70986515467a6bea3defd63215405ff225633654660f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.863151 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod338211e8_4b3f_458f_a37c_812c46dca850.slice\": RecentStats: unable to find data in memory cache]" Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.916693 4867 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="914ddb1b5019d0b3a62f70986515467a6bea3defd63215405ff225633654660f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 13:24:33 crc kubenswrapper[4867]: E1006 13:24:33.916989 4867 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="74873575-9758-4c41-8cee-d7e71b25e9e3" containerName="nova-scheduler-scheduler" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.933398 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3668fba3-af0f-478b-a41b-5de304592f65-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3668fba3-af0f-478b-a41b-5de304592f65\") " pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.933473 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3668fba3-af0f-478b-a41b-5de304592f65-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3668fba3-af0f-478b-a41b-5de304592f65\") " pod="openstack/nova-cell1-conductor-0" Oct 06 
13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.933513 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4n2l\" (UniqueName: \"kubernetes.io/projected/3668fba3-af0f-478b-a41b-5de304592f65-kube-api-access-l4n2l\") pod \"nova-cell1-conductor-0\" (UID: \"3668fba3-af0f-478b-a41b-5de304592f65\") " pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.937308 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3668fba3-af0f-478b-a41b-5de304592f65-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3668fba3-af0f-478b-a41b-5de304592f65\") " pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.938718 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3668fba3-af0f-478b-a41b-5de304592f65-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3668fba3-af0f-478b-a41b-5de304592f65\") " pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.952010 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4n2l\" (UniqueName: \"kubernetes.io/projected/3668fba3-af0f-478b-a41b-5de304592f65-kube-api-access-l4n2l\") pod \"nova-cell1-conductor-0\" (UID: \"3668fba3-af0f-478b-a41b-5de304592f65\") " pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:33 crc kubenswrapper[4867]: I1006 13:24:33.994527 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 13:24:34 crc kubenswrapper[4867]: I1006 13:24:34.008092 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:34 crc kubenswrapper[4867]: I1006 13:24:34.475435 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:24:34 crc kubenswrapper[4867]: I1006 13:24:34.490937 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 13:24:34 crc kubenswrapper[4867]: W1006 13:24:34.498139 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3668fba3_af0f_478b_a41b_5de304592f65.slice/crio-2535bcea6f15b116651751fd9f37a7cdbf0a3e52dcd638a639150a03da67d0ae WatchSource:0}: Error finding container 2535bcea6f15b116651751fd9f37a7cdbf0a3e52dcd638a639150a03da67d0ae: Status 404 returned error can't find the container with id 2535bcea6f15b116651751fd9f37a7cdbf0a3e52dcd638a639150a03da67d0ae Oct 06 13:24:34 crc kubenswrapper[4867]: I1006 13:24:34.550964 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3668fba3-af0f-478b-a41b-5de304592f65","Type":"ContainerStarted","Data":"2535bcea6f15b116651751fd9f37a7cdbf0a3e52dcd638a639150a03da67d0ae"} Oct 06 13:24:34 crc kubenswrapper[4867]: I1006 13:24:34.552979 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac98dfd9-17f4-4911-83b7-ae865a97d33c","Type":"ContainerStarted","Data":"55e5317e27f8e297b0a25e02f7f9009ec72e48f8333717d7282a6075da385cd7"} Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.251277 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="338211e8-4b3f-458f-a37c-812c46dca850" path="/var/lib/kubelet/pods/338211e8-4b3f-458f-a37c-812c46dca850/volumes" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.347070 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.379370 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw6rb\" (UniqueName: \"kubernetes.io/projected/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-kube-api-access-rw6rb\") pod \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.379463 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-config-data\") pod \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.379719 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-combined-ca-bundle\") pod \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.379846 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-logs\") pod \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\" (UID: \"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4\") " Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.386290 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-kube-api-access-rw6rb" (OuterVolumeSpecName: "kube-api-access-rw6rb") pod "6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" (UID: "6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4"). InnerVolumeSpecName "kube-api-access-rw6rb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.388852 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-logs" (OuterVolumeSpecName: "logs") pod "6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" (UID: "6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.410431 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-config-data" (OuterVolumeSpecName: "config-data") pod "6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" (UID: "6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.416219 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" (UID: "6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.491997 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.492040 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.492052 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw6rb\" (UniqueName: \"kubernetes.io/projected/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-kube-api-access-rw6rb\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.492061 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.566776 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac98dfd9-17f4-4911-83b7-ae865a97d33c","Type":"ContainerStarted","Data":"7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64"} Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.566825 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac98dfd9-17f4-4911-83b7-ae865a97d33c","Type":"ContainerStarted","Data":"bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26"} Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.569547 4867 generic.go:334] "Generic (PLEG): container finished" podID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerID="276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02" exitCode=0 Oct 06 13:24:35 crc 
kubenswrapper[4867]: I1006 13:24:35.569590 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4","Type":"ContainerDied","Data":"276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02"} Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.569607 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4","Type":"ContainerDied","Data":"45886262c011a119ed5994e577c47b43fef3cd025c46015c59560e72b5998471"} Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.569624 4867 scope.go:117] "RemoveContainer" containerID="276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.569730 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.580503 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3668fba3-af0f-478b-a41b-5de304592f65","Type":"ContainerStarted","Data":"1e0641037d82c505e1d02aef2add6432200d044dda5d194032c124f498809684"} Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.581166 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.604964 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6049435880000003 podStartE2EDuration="2.604943588s" podCreationTimestamp="2025-10-06 13:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:24:35.600667271 +0000 UTC m=+1255.058615425" watchObservedRunningTime="2025-10-06 13:24:35.604943588 +0000 UTC m=+1255.062891732" Oct 06 13:24:35 crc 
kubenswrapper[4867]: I1006 13:24:35.615490 4867 scope.go:117] "RemoveContainer" containerID="16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.631996 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.631971117 podStartE2EDuration="2.631971117s" podCreationTimestamp="2025-10-06 13:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:24:35.622665743 +0000 UTC m=+1255.080613877" watchObservedRunningTime="2025-10-06 13:24:35.631971117 +0000 UTC m=+1255.089919261" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.647362 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.657286 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.675033 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 13:24:35 crc kubenswrapper[4867]: E1006 13:24:35.675646 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerName="nova-api-log" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.675669 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerName="nova-api-log" Oct 06 13:24:35 crc kubenswrapper[4867]: E1006 13:24:35.675697 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerName="nova-api-api" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.675706 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerName="nova-api-api" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 
13:24:35.675932 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerName="nova-api-log" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.675951 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" containerName="nova-api-api" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.677351 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.680829 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.695666 4867 scope.go:117] "RemoveContainer" containerID="276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02" Oct 06 13:24:35 crc kubenswrapper[4867]: E1006 13:24:35.697489 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02\": container with ID starting with 276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02 not found: ID does not exist" containerID="276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.697633 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02"} err="failed to get container status \"276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02\": rpc error: code = NotFound desc = could not find container \"276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02\": container with ID starting with 276f463fb21099bff23b1db0e49af7d1f3734f133ddaa600979313edafb3ea02 not found: ID does not exist" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.697738 4867 
scope.go:117] "RemoveContainer" containerID="16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70" Oct 06 13:24:35 crc kubenswrapper[4867]: E1006 13:24:35.698109 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70\": container with ID starting with 16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70 not found: ID does not exist" containerID="16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.698217 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70"} err="failed to get container status \"16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70\": rpc error: code = NotFound desc = could not find container \"16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70\": container with ID starting with 16f0ff0d37051f2f331c686beb7effecded08e76f00222269d1c8b03f7b26a70 not found: ID does not exist" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.707068 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.811723 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.811916 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4pqt\" (UniqueName: \"kubernetes.io/projected/b9755686-2a73-47e4-a067-34ff8a92583a-kube-api-access-s4pqt\") pod 
\"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.812015 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9755686-2a73-47e4-a067-34ff8a92583a-logs\") pod \"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.812089 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-config-data\") pod \"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.914282 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4pqt\" (UniqueName: \"kubernetes.io/projected/b9755686-2a73-47e4-a067-34ff8a92583a-kube-api-access-s4pqt\") pod \"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.914367 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9755686-2a73-47e4-a067-34ff8a92583a-logs\") pod \"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.914440 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-config-data\") pod \"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.914492 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.915344 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9755686-2a73-47e4-a067-34ff8a92583a-logs\") pod \"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.919865 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.929396 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-config-data\") pod \"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:35 crc kubenswrapper[4867]: I1006 13:24:35.930583 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4pqt\" (UniqueName: \"kubernetes.io/projected/b9755686-2a73-47e4-a067-34ff8a92583a-kube-api-access-s4pqt\") pod \"nova-api-0\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") " pod="openstack/nova-api-0" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.016239 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.601099 4867 generic.go:334] "Generic (PLEG): container finished" podID="74873575-9758-4c41-8cee-d7e71b25e9e3" containerID="914ddb1b5019d0b3a62f70986515467a6bea3defd63215405ff225633654660f" exitCode=0 Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.601341 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"74873575-9758-4c41-8cee-d7e71b25e9e3","Type":"ContainerDied","Data":"914ddb1b5019d0b3a62f70986515467a6bea3defd63215405ff225633654660f"} Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.607623 4867 generic.go:334] "Generic (PLEG): container finished" podID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerID="e0a580e6a0daebb9815f50e20686e3bb7251077cab2afa1f34f86b0aa99ae106" exitCode=0 Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.607697 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64fb7edb-92e9-4b44-b05a-667d470f7b10","Type":"ContainerDied","Data":"e0a580e6a0daebb9815f50e20686e3bb7251077cab2afa1f34f86b0aa99ae106"} Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.735840 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.742086 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:24:36 crc kubenswrapper[4867]: W1006 13:24:36.748688 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9755686_2a73_47e4_a067_34ff8a92583a.slice/crio-c3e026deed6e0e61a878a73e374f1e6bbd48ccd6a3b9ec4295b453087426faa8 WatchSource:0}: Error finding container c3e026deed6e0e61a878a73e374f1e6bbd48ccd6a3b9ec4295b453087426faa8: Status 404 returned error can't find the container with id c3e026deed6e0e61a878a73e374f1e6bbd48ccd6a3b9ec4295b453087426faa8 Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.799312 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.842365 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-log-httpd\") pod \"64fb7edb-92e9-4b44-b05a-667d470f7b10\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.842473 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwlsf\" (UniqueName: \"kubernetes.io/projected/64fb7edb-92e9-4b44-b05a-667d470f7b10-kube-api-access-nwlsf\") pod \"64fb7edb-92e9-4b44-b05a-667d470f7b10\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.842535 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-sg-core-conf-yaml\") pod \"64fb7edb-92e9-4b44-b05a-667d470f7b10\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.842633 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-config-data\") pod \"64fb7edb-92e9-4b44-b05a-667d470f7b10\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.843202 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-combined-ca-bundle\") pod \"64fb7edb-92e9-4b44-b05a-667d470f7b10\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.843341 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-scripts\") pod \"64fb7edb-92e9-4b44-b05a-667d470f7b10\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.843525 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-run-httpd\") pod \"64fb7edb-92e9-4b44-b05a-667d470f7b10\" (UID: \"64fb7edb-92e9-4b44-b05a-667d470f7b10\") " Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.844673 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "64fb7edb-92e9-4b44-b05a-667d470f7b10" (UID: "64fb7edb-92e9-4b44-b05a-667d470f7b10"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.845457 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "64fb7edb-92e9-4b44-b05a-667d470f7b10" (UID: "64fb7edb-92e9-4b44-b05a-667d470f7b10"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.852327 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-scripts" (OuterVolumeSpecName: "scripts") pod "64fb7edb-92e9-4b44-b05a-667d470f7b10" (UID: "64fb7edb-92e9-4b44-b05a-667d470f7b10"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.852424 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64fb7edb-92e9-4b44-b05a-667d470f7b10-kube-api-access-nwlsf" (OuterVolumeSpecName: "kube-api-access-nwlsf") pod "64fb7edb-92e9-4b44-b05a-667d470f7b10" (UID: "64fb7edb-92e9-4b44-b05a-667d470f7b10"). InnerVolumeSpecName "kube-api-access-nwlsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.891878 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "64fb7edb-92e9-4b44-b05a-667d470f7b10" (UID: "64fb7edb-92e9-4b44-b05a-667d470f7b10"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.945774 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-config-data\") pod \"74873575-9758-4c41-8cee-d7e71b25e9e3\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.946110 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnpmc\" (UniqueName: \"kubernetes.io/projected/74873575-9758-4c41-8cee-d7e71b25e9e3-kube-api-access-dnpmc\") pod \"74873575-9758-4c41-8cee-d7e71b25e9e3\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.946205 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-combined-ca-bundle\") pod \"74873575-9758-4c41-8cee-d7e71b25e9e3\" (UID: \"74873575-9758-4c41-8cee-d7e71b25e9e3\") " Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.947057 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.947085 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.947146 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64fb7edb-92e9-4b44-b05a-667d470f7b10-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.947160 4867 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-nwlsf\" (UniqueName: \"kubernetes.io/projected/64fb7edb-92e9-4b44-b05a-667d470f7b10-kube-api-access-nwlsf\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.947175 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.949277 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74873575-9758-4c41-8cee-d7e71b25e9e3-kube-api-access-dnpmc" (OuterVolumeSpecName: "kube-api-access-dnpmc") pod "74873575-9758-4c41-8cee-d7e71b25e9e3" (UID: "74873575-9758-4c41-8cee-d7e71b25e9e3"). InnerVolumeSpecName "kube-api-access-dnpmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.949783 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64fb7edb-92e9-4b44-b05a-667d470f7b10" (UID: "64fb7edb-92e9-4b44-b05a-667d470f7b10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.994624 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-config-data" (OuterVolumeSpecName: "config-data") pod "74873575-9758-4c41-8cee-d7e71b25e9e3" (UID: "74873575-9758-4c41-8cee-d7e71b25e9e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:36 crc kubenswrapper[4867]: I1006 13:24:36.994731 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74873575-9758-4c41-8cee-d7e71b25e9e3" (UID: "74873575-9758-4c41-8cee-d7e71b25e9e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.000933 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-config-data" (OuterVolumeSpecName: "config-data") pod "64fb7edb-92e9-4b44-b05a-667d470f7b10" (UID: "64fb7edb-92e9-4b44-b05a-667d470f7b10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.050040 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.050086 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnpmc\" (UniqueName: \"kubernetes.io/projected/74873575-9758-4c41-8cee-d7e71b25e9e3-kube-api-access-dnpmc\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.050096 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.050105 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74873575-9758-4c41-8cee-d7e71b25e9e3-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.050116 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64fb7edb-92e9-4b44-b05a-667d470f7b10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.239868 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4" path="/var/lib/kubelet/pods/6a06eefd-3d2b-433b-bd3a-5a30c66ef8f4/volumes" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.624000 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64fb7edb-92e9-4b44-b05a-667d470f7b10","Type":"ContainerDied","Data":"c9923a040510e66161d89264474d37ad0b934e7ef1fceceb725a7a624d1faab3"} Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.624055 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.624103 4867 scope.go:117] "RemoveContainer" containerID="cdfe0ab8b58aa25595ad138ac85533d85ff017b824419a1c68e14ab1dc9bbf02" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.627208 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"74873575-9758-4c41-8cee-d7e71b25e9e3","Type":"ContainerDied","Data":"b1ead17fe57880bc801fab2fd6f77757b64021ae64494d8e7496972be48a2504"} Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.627374 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.629596 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9755686-2a73-47e4-a067-34ff8a92583a","Type":"ContainerStarted","Data":"b7d841e7d7b2c6607f2452e63860186573a1a079fb434836ce47497c51892a63"} Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.629631 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9755686-2a73-47e4-a067-34ff8a92583a","Type":"ContainerStarted","Data":"34b138e26bceb83b9a5cc5d285b976cf2b610284dc481d41ce9d6a4be7a9c615"} Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.629646 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9755686-2a73-47e4-a067-34ff8a92583a","Type":"ContainerStarted","Data":"c3e026deed6e0e61a878a73e374f1e6bbd48ccd6a3b9ec4295b453087426faa8"} Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.658886 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.658863318 podStartE2EDuration="2.658863318s" podCreationTimestamp="2025-10-06 13:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:24:37.646769737 +0000 UTC m=+1257.104717881" watchObservedRunningTime="2025-10-06 13:24:37.658863318 +0000 UTC m=+1257.116811452" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.768459 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.786479 4867 scope.go:117] "RemoveContainer" containerID="1ed25887d16f76930f9b625495510144e00929118f3d31c30806f582ffd83626" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.794804 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:24:37 
crc kubenswrapper[4867]: I1006 13:24:37.818614 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:24:37 crc kubenswrapper[4867]: E1006 13:24:37.819331 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74873575-9758-4c41-8cee-d7e71b25e9e3" containerName="nova-scheduler-scheduler" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.819357 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="74873575-9758-4c41-8cee-d7e71b25e9e3" containerName="nova-scheduler-scheduler" Oct 06 13:24:37 crc kubenswrapper[4867]: E1006 13:24:37.819381 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="ceilometer-central-agent" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.819389 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="ceilometer-central-agent" Oct 06 13:24:37 crc kubenswrapper[4867]: E1006 13:24:37.819461 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="sg-core" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.819471 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="sg-core" Oct 06 13:24:37 crc kubenswrapper[4867]: E1006 13:24:37.819495 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="ceilometer-notification-agent" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.819505 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="ceilometer-notification-agent" Oct 06 13:24:37 crc kubenswrapper[4867]: E1006 13:24:37.819531 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="proxy-httpd" Oct 06 13:24:37 crc 
kubenswrapper[4867]: I1006 13:24:37.819540 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="proxy-httpd" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.819749 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="ceilometer-notification-agent" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.819762 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="proxy-httpd" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.819775 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="sg-core" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.819783 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="74873575-9758-4c41-8cee-d7e71b25e9e3" containerName="nova-scheduler-scheduler" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.819800 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" containerName="ceilometer-central-agent" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.820652 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.822894 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.832126 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.858709 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.863441 4867 scope.go:117] "RemoveContainer" containerID="e0a580e6a0daebb9815f50e20686e3bb7251077cab2afa1f34f86b0aa99ae106" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.878599 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.878724 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-config-data\") pod \"nova-scheduler-0\" (UID: \"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.878863 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xrhj\" (UniqueName: \"kubernetes.io/projected/258a8420-c9fc-4115-b575-687ed7d8bc2a-kube-api-access-5xrhj\") pod \"nova-scheduler-0\" (UID: \"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.881403 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-scheduler-0"] Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.897371 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.899040 4867 scope.go:117] "RemoveContainer" containerID="c57edc48609e65fde5369175114434967c4882943e1979b777657bda75fe0bdb" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.901514 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.903647 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.904984 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.909535 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.911153 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.938083 4867 scope.go:117] "RemoveContainer" containerID="914ddb1b5019d0b3a62f70986515467a6bea3defd63215405ff225633654660f" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.980700 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-scripts\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.980748 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-log-httpd\") pod 
\"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.980782 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-run-httpd\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.980817 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.980839 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.980887 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.980920 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-config-data\") pod \"nova-scheduler-0\" (UID: \"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 
13:24:37.980938 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.980975 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-config-data\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.981016 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xrhj\" (UniqueName: \"kubernetes.io/projected/258a8420-c9fc-4115-b575-687ed7d8bc2a-kube-api-access-5xrhj\") pod \"nova-scheduler-0\" (UID: \"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.981038 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzdn2\" (UniqueName: \"kubernetes.io/projected/74202e3a-7749-43d1-80dc-84e60fb4fc24-kube-api-access-tzdn2\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.986495 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.986806 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-config-data\") pod \"nova-scheduler-0\" (UID: \"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:37 crc kubenswrapper[4867]: I1006 13:24:37.999920 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xrhj\" (UniqueName: \"kubernetes.io/projected/258a8420-c9fc-4115-b575-687ed7d8bc2a-kube-api-access-5xrhj\") pod \"nova-scheduler-0\" (UID: \"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " pod="openstack/nova-scheduler-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.082635 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-scripts\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.082682 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-log-httpd\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.082711 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-run-httpd\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.082749 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc 
kubenswrapper[4867]: I1006 13:24:38.082798 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.082819 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.082853 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-config-data\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.082897 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdn2\" (UniqueName: \"kubernetes.io/projected/74202e3a-7749-43d1-80dc-84e60fb4fc24-kube-api-access-tzdn2\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.083504 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-log-httpd\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.083717 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-run-httpd\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.085976 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-scripts\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.090012 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.090124 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.090646 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.091631 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-config-data\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.103291 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzdn2\" (UniqueName: \"kubernetes.io/projected/74202e3a-7749-43d1-80dc-84e60fb4fc24-kube-api-access-tzdn2\") pod \"ceilometer-0\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.142652 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.230818 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.599364 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:24:38 crc kubenswrapper[4867]: W1006 13:24:38.602952 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod258a8420_c9fc_4115_b575_687ed7d8bc2a.slice/crio-844118a0c98e39417150df5c29c44079e7ebd877ec9ba8bc2a0f4b19a95a8f99 WatchSource:0}: Error finding container 844118a0c98e39417150df5c29c44079e7ebd877ec9ba8bc2a0f4b19a95a8f99: Status 404 returned error can't find the container with id 844118a0c98e39417150df5c29c44079e7ebd877ec9ba8bc2a0f4b19a95a8f99 Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.644580 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"258a8420-c9fc-4115-b575-687ed7d8bc2a","Type":"ContainerStarted","Data":"844118a0c98e39417150df5c29c44079e7ebd877ec9ba8bc2a0f4b19a95a8f99"} Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.706548 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 13:24:38.998390 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 13:24:38 crc kubenswrapper[4867]: I1006 
13:24:38.998433 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 13:24:39 crc kubenswrapper[4867]: I1006 13:24:39.041593 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 06 13:24:39 crc kubenswrapper[4867]: I1006 13:24:39.241768 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64fb7edb-92e9-4b44-b05a-667d470f7b10" path="/var/lib/kubelet/pods/64fb7edb-92e9-4b44-b05a-667d470f7b10/volumes" Oct 06 13:24:39 crc kubenswrapper[4867]: I1006 13:24:39.243120 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74873575-9758-4c41-8cee-d7e71b25e9e3" path="/var/lib/kubelet/pods/74873575-9758-4c41-8cee-d7e71b25e9e3/volumes" Oct 06 13:24:39 crc kubenswrapper[4867]: I1006 13:24:39.672894 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74202e3a-7749-43d1-80dc-84e60fb4fc24","Type":"ContainerStarted","Data":"cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1"} Oct 06 13:24:39 crc kubenswrapper[4867]: I1006 13:24:39.672954 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74202e3a-7749-43d1-80dc-84e60fb4fc24","Type":"ContainerStarted","Data":"db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235"} Oct 06 13:24:39 crc kubenswrapper[4867]: I1006 13:24:39.672975 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74202e3a-7749-43d1-80dc-84e60fb4fc24","Type":"ContainerStarted","Data":"0b7f20c0888f71745379fa8508d470a198f5ba855baf44640b9a15b633dc521d"} Oct 06 13:24:39 crc kubenswrapper[4867]: I1006 13:24:39.675979 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"258a8420-c9fc-4115-b575-687ed7d8bc2a","Type":"ContainerStarted","Data":"bb3e7b5a269e6d21fabdd2390579caa216cd3ac4ee06aa1899194dfa088bb840"} Oct 06 
13:24:39 crc kubenswrapper[4867]: I1006 13:24:39.701514 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.701489831 podStartE2EDuration="2.701489831s" podCreationTimestamp="2025-10-06 13:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:24:39.699391974 +0000 UTC m=+1259.157340118" watchObservedRunningTime="2025-10-06 13:24:39.701489831 +0000 UTC m=+1259.159437975" Oct 06 13:24:39 crc kubenswrapper[4867]: I1006 13:24:39.927380 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 13:24:40 crc kubenswrapper[4867]: I1006 13:24:40.698809 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74202e3a-7749-43d1-80dc-84e60fb4fc24","Type":"ContainerStarted","Data":"8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a"} Oct 06 13:24:41 crc kubenswrapper[4867]: I1006 13:24:41.712718 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74202e3a-7749-43d1-80dc-84e60fb4fc24","Type":"ContainerStarted","Data":"b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f"} Oct 06 13:24:41 crc kubenswrapper[4867]: I1006 13:24:41.714757 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 13:24:41 crc kubenswrapper[4867]: I1006 13:24:41.747390 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.141775836 podStartE2EDuration="4.747374151s" podCreationTimestamp="2025-10-06 13:24:37 +0000 UTC" firstStartedPulling="2025-10-06 13:24:38.718181534 +0000 UTC m=+1258.176129678" lastFinishedPulling="2025-10-06 13:24:41.323779849 +0000 UTC m=+1260.781727993" observedRunningTime="2025-10-06 13:24:41.746657532 +0000 UTC 
m=+1261.204605676" watchObservedRunningTime="2025-10-06 13:24:41.747374151 +0000 UTC m=+1261.205322295" Oct 06 13:24:42 crc kubenswrapper[4867]: I1006 13:24:42.873492 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:24:42 crc kubenswrapper[4867]: I1006 13:24:42.874047 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:24:42 crc kubenswrapper[4867]: I1006 13:24:42.874137 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:24:42 crc kubenswrapper[4867]: I1006 13:24:42.874968 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"266184608e50b4d6729b714d56d0cdb437a575eeec8c7e5f18126b05fc5a103e"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:24:42 crc kubenswrapper[4867]: I1006 13:24:42.875039 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://266184608e50b4d6729b714d56d0cdb437a575eeec8c7e5f18126b05fc5a103e" gracePeriod=600 Oct 06 13:24:43 crc kubenswrapper[4867]: I1006 13:24:43.143416 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 13:24:43 crc kubenswrapper[4867]: I1006 13:24:43.740688 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="266184608e50b4d6729b714d56d0cdb437a575eeec8c7e5f18126b05fc5a103e" exitCode=0 Oct 06 13:24:43 crc kubenswrapper[4867]: I1006 13:24:43.742423 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"266184608e50b4d6729b714d56d0cdb437a575eeec8c7e5f18126b05fc5a103e"} Oct 06 13:24:43 crc kubenswrapper[4867]: I1006 13:24:43.742466 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"708d16f9a6115595b008bafc5ad0e6ec3528bd438b87bd249255c174238bf7ec"} Oct 06 13:24:43 crc kubenswrapper[4867]: I1006 13:24:43.742487 4867 scope.go:117] "RemoveContainer" containerID="0a46508d237859c347210237945b8f376811db88e9f318300207a6c9aaeafb5d" Oct 06 13:24:43 crc kubenswrapper[4867]: I1006 13:24:43.994914 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 13:24:43 crc kubenswrapper[4867]: I1006 13:24:43.995457 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 13:24:45 crc kubenswrapper[4867]: I1006 13:24:45.009453 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 13:24:45 crc kubenswrapper[4867]: I1006 13:24:45.009472 4867 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 13:24:46 crc kubenswrapper[4867]: I1006 13:24:46.017026 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 13:24:46 crc kubenswrapper[4867]: I1006 13:24:46.017347 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 13:24:47 crc kubenswrapper[4867]: I1006 13:24:47.099605 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b9755686-2a73-47e4-a067-34ff8a92583a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 13:24:47 crc kubenswrapper[4867]: I1006 13:24:47.099631 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b9755686-2a73-47e4-a067-34ff8a92583a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 13:24:48 crc kubenswrapper[4867]: I1006 13:24:48.142990 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 13:24:48 crc kubenswrapper[4867]: I1006 13:24:48.172828 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 13:24:48 crc kubenswrapper[4867]: I1006 13:24:48.843113 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:53.999890 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.001578 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.006676 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 13:24:54 crc kubenswrapper[4867]: E1006 13:24:54.440055 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3d871ba_46f8_465c_8bfe_2fb22ebc9c30.slice/crio-6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae.scope\": RecentStats: unable to find data in memory cache]" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.785761 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.859643 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjszs\" (UniqueName: \"kubernetes.io/projected/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-kube-api-access-vjszs\") pod \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\" (UID: \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.859696 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-config-data\") pod \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\" (UID: \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.859815 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-combined-ca-bundle\") pod \"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\" (UID: 
\"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30\") " Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.867223 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-kube-api-access-vjszs" (OuterVolumeSpecName: "kube-api-access-vjszs") pod "d3d871ba-46f8-465c-8bfe-2fb22ebc9c30" (UID: "d3d871ba-46f8-465c-8bfe-2fb22ebc9c30"). InnerVolumeSpecName "kube-api-access-vjszs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.881857 4867 generic.go:334] "Generic (PLEG): container finished" podID="d3d871ba-46f8-465c-8bfe-2fb22ebc9c30" containerID="6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae" exitCode=137 Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.881896 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.881936 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30","Type":"ContainerDied","Data":"6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae"} Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.882184 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30","Type":"ContainerDied","Data":"77f9d5df6f74646c4f7fd1634e8b45dc7616f8bf1a170fd7ce666f21a7cbad85"} Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.882216 4867 scope.go:117] "RemoveContainer" containerID="6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.893627 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-config-data" (OuterVolumeSpecName: "config-data") pod 
"d3d871ba-46f8-465c-8bfe-2fb22ebc9c30" (UID: "d3d871ba-46f8-465c-8bfe-2fb22ebc9c30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.895081 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.899738 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3d871ba-46f8-465c-8bfe-2fb22ebc9c30" (UID: "d3d871ba-46f8-465c-8bfe-2fb22ebc9c30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.962047 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.962078 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjszs\" (UniqueName: \"kubernetes.io/projected/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-kube-api-access-vjszs\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.962089 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.997544 4867 scope.go:117] "RemoveContainer" containerID="6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae" Oct 06 13:24:54 crc kubenswrapper[4867]: E1006 13:24:54.998033 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae\": container with ID starting with 6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae not found: ID does not exist" containerID="6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae" Oct 06 13:24:54 crc kubenswrapper[4867]: I1006 13:24:54.998077 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae"} err="failed to get container status \"6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae\": rpc error: code = NotFound desc = could not find container \"6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae\": container with ID starting with 6c84ee90f9e00d46d5d07b2ce85650a9e11fb79a3a15b4beff20896f8fb783ae not found: ID does not exist" Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.251707 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.261225 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.291456 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 13:24:55 crc kubenswrapper[4867]: E1006 13:24:55.291982 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d871ba-46f8-465c-8bfe-2fb22ebc9c30" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.292002 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d871ba-46f8-465c-8bfe-2fb22ebc9c30" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.292183 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d871ba-46f8-465c-8bfe-2fb22ebc9c30" containerName="nova-cell1-novncproxy-novncproxy" Oct 
06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.293471 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.295961 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.296098 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.296165 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.299879 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.373461 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.373561 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.373877 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.373995 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2ghh\" (UniqueName: \"kubernetes.io/projected/38e2bbc3-d543-4521-bc10-88635228f1a9-kube-api-access-b2ghh\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.374229 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.476409 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.476513 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.476592 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.476630 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2ghh\" (UniqueName: \"kubernetes.io/projected/38e2bbc3-d543-4521-bc10-88635228f1a9-kube-api-access-b2ghh\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.476694 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.481240 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.482105 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.482595 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.484408 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/38e2bbc3-d543-4521-bc10-88635228f1a9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.499885 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2ghh\" (UniqueName: \"kubernetes.io/projected/38e2bbc3-d543-4521-bc10-88635228f1a9-kube-api-access-b2ghh\") pod \"nova-cell1-novncproxy-0\" (UID: \"38e2bbc3-d543-4521-bc10-88635228f1a9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:55 crc kubenswrapper[4867]: I1006 13:24:55.621180 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:24:56 crc kubenswrapper[4867]: I1006 13:24:56.023562 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 06 13:24:56 crc kubenswrapper[4867]: I1006 13:24:56.024109 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 06 13:24:56 crc kubenswrapper[4867]: I1006 13:24:56.024682 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 06 13:24:56 crc kubenswrapper[4867]: I1006 13:24:56.030077 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 06 13:24:56 crc kubenswrapper[4867]: I1006 13:24:56.080040 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 13:24:56 crc kubenswrapper[4867]: I1006 13:24:56.905605 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38e2bbc3-d543-4521-bc10-88635228f1a9","Type":"ContainerStarted","Data":"92b4c0fd2b6d41e6133192e1f1237565f88cf192ea08d12113717754f4dfbff9"}
Oct 06 13:24:56 crc kubenswrapper[4867]: I1006 13:24:56.906017 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 06 13:24:56 crc kubenswrapper[4867]: I1006 13:24:56.906034 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38e2bbc3-d543-4521-bc10-88635228f1a9","Type":"ContainerStarted","Data":"97163dec6edaf5ab9c2172513526804ae1b2c0f19e9bd14e23605e48ef6fe263"}
Oct 06 13:24:56 crc kubenswrapper[4867]: I1006 13:24:56.912999 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 06 13:24:56 crc kubenswrapper[4867]: I1006 13:24:56.932529 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.932511226 podStartE2EDuration="1.932511226s" podCreationTimestamp="2025-10-06 13:24:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:24:56.928558448 +0000 UTC m=+1276.386506602" watchObservedRunningTime="2025-10-06 13:24:56.932511226 +0000 UTC m=+1276.390459380"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.100178 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d7fff947c-95sph"]
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.103068 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.144303 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d7fff947c-95sph"]
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.213373 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbbhz\" (UniqueName: \"kubernetes.io/projected/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-kube-api-access-wbbhz\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.213438 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-swift-storage-0\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.213499 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-svc\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.213531 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-nb\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.213573 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-config\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.213650 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-sb\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.234031 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3d871ba-46f8-465c-8bfe-2fb22ebc9c30" path="/var/lib/kubelet/pods/d3d871ba-46f8-465c-8bfe-2fb22ebc9c30/volumes"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.316738 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-sb\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.316990 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbbhz\" (UniqueName: \"kubernetes.io/projected/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-kube-api-access-wbbhz\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.317117 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-swift-storage-0\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.317385 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-svc\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.317551 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-nb\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.317670 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-sb\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.317777 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-config\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.319129 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-svc\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.319201 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-swift-storage-0\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.319987 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-config\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.320155 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-nb\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.336286 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbbhz\" (UniqueName: \"kubernetes.io/projected/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-kube-api-access-wbbhz\") pod \"dnsmasq-dns-7d7fff947c-95sph\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:57 crc kubenswrapper[4867]: I1006 13:24:57.460768 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:24:58 crc kubenswrapper[4867]: I1006 13:24:58.070242 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d7fff947c-95sph"]
Oct 06 13:24:58 crc kubenswrapper[4867]: I1006 13:24:58.924226 4867 generic.go:334] "Generic (PLEG): container finished" podID="bb55b4ec-8008-40ec-922c-15dab4b1dcd6" containerID="b2ef4e326b88fe5ee81c9beba529dd727f3de468a2ff004a2a12ab3c09743f6a" exitCode=0
Oct 06 13:24:58 crc kubenswrapper[4867]: I1006 13:24:58.924541 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" event={"ID":"bb55b4ec-8008-40ec-922c-15dab4b1dcd6","Type":"ContainerDied","Data":"b2ef4e326b88fe5ee81c9beba529dd727f3de468a2ff004a2a12ab3c09743f6a"}
Oct 06 13:24:58 crc kubenswrapper[4867]: I1006 13:24:58.925401 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" event={"ID":"bb55b4ec-8008-40ec-922c-15dab4b1dcd6","Type":"ContainerStarted","Data":"8a4a889426479d1f1d48b7f24ac54bef54f36168ef34b85bf913db4107f26ceb"}
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.404599 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.405283 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="ceilometer-central-agent" containerID="cri-o://db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235" gracePeriod=30
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.405402 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="ceilometer-notification-agent" containerID="cri-o://cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1" gracePeriod=30
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.405402 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="sg-core" containerID="cri-o://8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a" gracePeriod=30
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.405401 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="proxy-httpd" containerID="cri-o://b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f" gracePeriod=30
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.416524 4867 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.219:3000/\": EOF"
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.821714 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.935620 4867 generic.go:334] "Generic (PLEG): container finished" podID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerID="b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f" exitCode=0
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.935650 4867 generic.go:334] "Generic (PLEG): container finished" podID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerID="8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a" exitCode=2
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.935660 4867 generic.go:334] "Generic (PLEG): container finished" podID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerID="db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235" exitCode=0
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.935692 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74202e3a-7749-43d1-80dc-84e60fb4fc24","Type":"ContainerDied","Data":"b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f"}
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.935740 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74202e3a-7749-43d1-80dc-84e60fb4fc24","Type":"ContainerDied","Data":"8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a"}
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.935752 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74202e3a-7749-43d1-80dc-84e60fb4fc24","Type":"ContainerDied","Data":"db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235"}
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.938405 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" event={"ID":"bb55b4ec-8008-40ec-922c-15dab4b1dcd6","Type":"ContainerStarted","Data":"9f51ebb4cae8e38fa5fcdebce7b297cf8834b65a7ed5e6f0ec7ede21e4db0b00"}
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.938548 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b9755686-2a73-47e4-a067-34ff8a92583a" containerName="nova-api-log" containerID="cri-o://34b138e26bceb83b9a5cc5d285b976cf2b610284dc481d41ce9d6a4be7a9c615" gracePeriod=30
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.938608 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b9755686-2a73-47e4-a067-34ff8a92583a" containerName="nova-api-api" containerID="cri-o://b7d841e7d7b2c6607f2452e63860186573a1a079fb434836ce47497c51892a63" gracePeriod=30
Oct 06 13:24:59 crc kubenswrapper[4867]: I1006 13:24:59.966614 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" podStartSLOduration=2.966595826 podStartE2EDuration="2.966595826s" podCreationTimestamp="2025-10-06 13:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:24:59.960176871 +0000 UTC m=+1279.418125015" watchObservedRunningTime="2025-10-06 13:24:59.966595826 +0000 UTC m=+1279.424543970"
Oct 06 13:25:00 crc kubenswrapper[4867]: I1006 13:25:00.621749 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 13:25:00 crc kubenswrapper[4867]: I1006 13:25:00.949736 4867 generic.go:334] "Generic (PLEG): container finished" podID="b9755686-2a73-47e4-a067-34ff8a92583a" containerID="34b138e26bceb83b9a5cc5d285b976cf2b610284dc481d41ce9d6a4be7a9c615" exitCode=143
Oct 06 13:25:00 crc kubenswrapper[4867]: I1006 13:25:00.950813 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9755686-2a73-47e4-a067-34ff8a92583a","Type":"ContainerDied","Data":"34b138e26bceb83b9a5cc5d285b976cf2b610284dc481d41ce9d6a4be7a9c615"}
Oct 06 13:25:00 crc kubenswrapper[4867]: I1006 13:25:00.950850 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d7fff947c-95sph"
Oct 06 13:25:01 crc kubenswrapper[4867]: I1006 13:25:01.963696 4867 generic.go:334] "Generic (PLEG): container finished" podID="b9755686-2a73-47e4-a067-34ff8a92583a" containerID="b7d841e7d7b2c6607f2452e63860186573a1a079fb434836ce47497c51892a63" exitCode=0
Oct 06 13:25:01 crc kubenswrapper[4867]: I1006 13:25:01.963763 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9755686-2a73-47e4-a067-34ff8a92583a","Type":"ContainerDied","Data":"b7d841e7d7b2c6607f2452e63860186573a1a079fb434836ce47497c51892a63"}
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.058655 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.132650 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-combined-ca-bundle\") pod \"b9755686-2a73-47e4-a067-34ff8a92583a\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") "
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.132756 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-config-data\") pod \"b9755686-2a73-47e4-a067-34ff8a92583a\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") "
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.132953 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4pqt\" (UniqueName: \"kubernetes.io/projected/b9755686-2a73-47e4-a067-34ff8a92583a-kube-api-access-s4pqt\") pod \"b9755686-2a73-47e4-a067-34ff8a92583a\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") "
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.133023 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9755686-2a73-47e4-a067-34ff8a92583a-logs\") pod \"b9755686-2a73-47e4-a067-34ff8a92583a\" (UID: \"b9755686-2a73-47e4-a067-34ff8a92583a\") "
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.133955 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9755686-2a73-47e4-a067-34ff8a92583a-logs" (OuterVolumeSpecName: "logs") pod "b9755686-2a73-47e4-a067-34ff8a92583a" (UID: "b9755686-2a73-47e4-a067-34ff8a92583a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.142399 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9755686-2a73-47e4-a067-34ff8a92583a-kube-api-access-s4pqt" (OuterVolumeSpecName: "kube-api-access-s4pqt") pod "b9755686-2a73-47e4-a067-34ff8a92583a" (UID: "b9755686-2a73-47e4-a067-34ff8a92583a"). InnerVolumeSpecName "kube-api-access-s4pqt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.171598 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-config-data" (OuterVolumeSpecName: "config-data") pod "b9755686-2a73-47e4-a067-34ff8a92583a" (UID: "b9755686-2a73-47e4-a067-34ff8a92583a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.195433 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9755686-2a73-47e4-a067-34ff8a92583a" (UID: "b9755686-2a73-47e4-a067-34ff8a92583a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.235775 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.235815 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4pqt\" (UniqueName: \"kubernetes.io/projected/b9755686-2a73-47e4-a067-34ff8a92583a-kube-api-access-s4pqt\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.235828 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9755686-2a73-47e4-a067-34ff8a92583a-logs\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.235841 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9755686-2a73-47e4-a067-34ff8a92583a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.975605 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9755686-2a73-47e4-a067-34ff8a92583a","Type":"ContainerDied","Data":"c3e026deed6e0e61a878a73e374f1e6bbd48ccd6a3b9ec4295b453087426faa8"}
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.975677 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 13:25:02 crc kubenswrapper[4867]: I1006 13:25:02.975925 4867 scope.go:117] "RemoveContainer" containerID="b7d841e7d7b2c6607f2452e63860186573a1a079fb434836ce47497c51892a63"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.008540 4867 scope.go:117] "RemoveContainer" containerID="34b138e26bceb83b9a5cc5d285b976cf2b610284dc481d41ce9d6a4be7a9c615"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.026338 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.040891 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.059401 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 06 13:25:03 crc kubenswrapper[4867]: E1006 13:25:03.059849 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9755686-2a73-47e4-a067-34ff8a92583a" containerName="nova-api-log"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.059870 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9755686-2a73-47e4-a067-34ff8a92583a" containerName="nova-api-log"
Oct 06 13:25:03 crc kubenswrapper[4867]: E1006 13:25:03.059885 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9755686-2a73-47e4-a067-34ff8a92583a" containerName="nova-api-api"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.059892 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9755686-2a73-47e4-a067-34ff8a92583a" containerName="nova-api-api"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.060201 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9755686-2a73-47e4-a067-34ff8a92583a" containerName="nova-api-api"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.060282 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9755686-2a73-47e4-a067-34ff8a92583a" containerName="nova-api-log"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.062848 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.064881 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.065112 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.065459 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.072137 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.156492 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.156556 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-public-tls-certs\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.156748 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.156852 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwctk\" (UniqueName: \"kubernetes.io/projected/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-kube-api-access-dwctk\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.156948 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-logs\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.157050 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-config-data\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.239485 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9755686-2a73-47e4-a067-34ff8a92583a" path="/var/lib/kubelet/pods/b9755686-2a73-47e4-a067-34ff8a92583a/volumes"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.259386 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-logs\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.259495 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-config-data\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.259532 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.259583 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-public-tls-certs\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.259698 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.259718 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwctk\" (UniqueName: \"kubernetes.io/projected/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-kube-api-access-dwctk\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.259950 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-logs\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.265090 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.266018 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-public-tls-certs\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.267102 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-config-data\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.274161 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.281322 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwctk\" (UniqueName: \"kubernetes.io/projected/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-kube-api-access-dwctk\") pod \"nova-api-0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " pod="openstack/nova-api-0"
Oct 06 13:25:03 crc kubenswrapper[4867]: I1006 13:25:03.389745 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 06 13:25:04 crc kubenswrapper[4867]: I1006 13:25:03.901035 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 06 13:25:04 crc kubenswrapper[4867]: I1006 13:25:03.994191 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0","Type":"ContainerStarted","Data":"844f456527d59428924d09f6278879aa13d8aa18f3b7da682fdce8784dd4b325"}
Oct 06 13:25:04 crc kubenswrapper[4867]: E1006 13:25:04.744651 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74202e3a_7749_43d1_80dc_84e60fb4fc24.slice/crio-cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74202e3a_7749_43d1_80dc_84e60fb4fc24.slice/crio-conmon-cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1.scope\": RecentStats: unable to find data in memory cache]"
Oct 06 13:25:04 crc kubenswrapper[4867]: I1006 13:25:04.935345 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.001778 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-sg-core-conf-yaml\") pod \"74202e3a-7749-43d1-80dc-84e60fb4fc24\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.001891 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-log-httpd\") pod \"74202e3a-7749-43d1-80dc-84e60fb4fc24\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.002122 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-config-data\") pod \"74202e3a-7749-43d1-80dc-84e60fb4fc24\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.002179 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-ceilometer-tls-certs\") pod \"74202e3a-7749-43d1-80dc-84e60fb4fc24\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.002355 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-combined-ca-bundle\") pod \"74202e3a-7749-43d1-80dc-84e60fb4fc24\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.002571 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-run-httpd\") pod \"74202e3a-7749-43d1-80dc-84e60fb4fc24\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.002650 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-scripts\") pod \"74202e3a-7749-43d1-80dc-84e60fb4fc24\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.002737 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzdn2\" (UniqueName: \"kubernetes.io/projected/74202e3a-7749-43d1-80dc-84e60fb4fc24-kube-api-access-tzdn2\") pod \"74202e3a-7749-43d1-80dc-84e60fb4fc24\" (UID: \"74202e3a-7749-43d1-80dc-84e60fb4fc24\") " Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.002998 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "74202e3a-7749-43d1-80dc-84e60fb4fc24" (UID: "74202e3a-7749-43d1-80dc-84e60fb4fc24"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.003220 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "74202e3a-7749-43d1-80dc-84e60fb4fc24" (UID: "74202e3a-7749-43d1-80dc-84e60fb4fc24"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.005606 4867 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.005638 4867 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74202e3a-7749-43d1-80dc-84e60fb4fc24-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.015563 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74202e3a-7749-43d1-80dc-84e60fb4fc24-kube-api-access-tzdn2" (OuterVolumeSpecName: "kube-api-access-tzdn2") pod "74202e3a-7749-43d1-80dc-84e60fb4fc24" (UID: "74202e3a-7749-43d1-80dc-84e60fb4fc24"). InnerVolumeSpecName "kube-api-access-tzdn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.015816 4867 generic.go:334] "Generic (PLEG): container finished" podID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerID="cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1" exitCode=0 Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.015945 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74202e3a-7749-43d1-80dc-84e60fb4fc24","Type":"ContainerDied","Data":"cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1"} Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.015990 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.016021 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74202e3a-7749-43d1-80dc-84e60fb4fc24","Type":"ContainerDied","Data":"0b7f20c0888f71745379fa8508d470a198f5ba855baf44640b9a15b633dc521d"} Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.016127 4867 scope.go:117] "RemoveContainer" containerID="b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.020884 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0","Type":"ContainerStarted","Data":"78c0bc7e6c48ed54a99d5cdd1ad13ad8a4d146c1126436340c6103f26b132d32"} Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.020960 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0","Type":"ContainerStarted","Data":"722008c23e4359d8da69118e55467fdca424b690a4d4576d86c718e72b537ac8"} Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.023989 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-scripts" (OuterVolumeSpecName: "scripts") pod "74202e3a-7749-43d1-80dc-84e60fb4fc24" (UID: "74202e3a-7749-43d1-80dc-84e60fb4fc24"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.036745 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "74202e3a-7749-43d1-80dc-84e60fb4fc24" (UID: "74202e3a-7749-43d1-80dc-84e60fb4fc24"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.048395 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.048366616 podStartE2EDuration="2.048366616s" podCreationTimestamp="2025-10-06 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:25:05.042046293 +0000 UTC m=+1284.499994447" watchObservedRunningTime="2025-10-06 13:25:05.048366616 +0000 UTC m=+1284.506314760" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.069446 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "74202e3a-7749-43d1-80dc-84e60fb4fc24" (UID: "74202e3a-7749-43d1-80dc-84e60fb4fc24"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.105396 4867 scope.go:117] "RemoveContainer" containerID="8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.106991 4867 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.107032 4867 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.107046 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.107222 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzdn2\" (UniqueName: \"kubernetes.io/projected/74202e3a-7749-43d1-80dc-84e60fb4fc24-kube-api-access-tzdn2\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.123367 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74202e3a-7749-43d1-80dc-84e60fb4fc24" (UID: "74202e3a-7749-43d1-80dc-84e60fb4fc24"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.128336 4867 scope.go:117] "RemoveContainer" containerID="cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.147460 4867 scope.go:117] "RemoveContainer" containerID="db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.149572 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-config-data" (OuterVolumeSpecName: "config-data") pod "74202e3a-7749-43d1-80dc-84e60fb4fc24" (UID: "74202e3a-7749-43d1-80dc-84e60fb4fc24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.166124 4867 scope.go:117] "RemoveContainer" containerID="b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f" Oct 06 13:25:05 crc kubenswrapper[4867]: E1006 13:25:05.166737 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f\": container with ID starting with b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f not found: ID does not exist" containerID="b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.166814 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f"} err="failed to get container status \"b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f\": rpc error: code = NotFound desc = could not find container \"b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f\": container with ID starting with 
b14bf441468d511d268b79894e9bbdb6c3f6e96695a5e58c0e4247ca79444f4f not found: ID does not exist" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.166855 4867 scope.go:117] "RemoveContainer" containerID="8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a" Oct 06 13:25:05 crc kubenswrapper[4867]: E1006 13:25:05.167323 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a\": container with ID starting with 8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a not found: ID does not exist" containerID="8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.167404 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a"} err="failed to get container status \"8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a\": rpc error: code = NotFound desc = could not find container \"8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a\": container with ID starting with 8e306d1340412566fa929ec6d46c5e3273bab43b35db0a0d5d9667de74df669a not found: ID does not exist" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.167464 4867 scope.go:117] "RemoveContainer" containerID="cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1" Oct 06 13:25:05 crc kubenswrapper[4867]: E1006 13:25:05.167871 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1\": container with ID starting with cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1 not found: ID does not exist" containerID="cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1" Oct 06 13:25:05 crc 
kubenswrapper[4867]: I1006 13:25:05.167903 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1"} err="failed to get container status \"cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1\": rpc error: code = NotFound desc = could not find container \"cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1\": container with ID starting with cfac670482bbccff7a4564ba273cd6271eca4580e8a596f2371b624756ef6bc1 not found: ID does not exist" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.167919 4867 scope.go:117] "RemoveContainer" containerID="db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235" Oct 06 13:25:05 crc kubenswrapper[4867]: E1006 13:25:05.168203 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235\": container with ID starting with db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235 not found: ID does not exist" containerID="db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.168248 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235"} err="failed to get container status \"db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235\": rpc error: code = NotFound desc = could not find container \"db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235\": container with ID starting with db3cb63343fdc753afb31a24cc40528b505b59c7429ba4ef904e8eda85390235 not found: ID does not exist" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.209893 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.210121 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74202e3a-7749-43d1-80dc-84e60fb4fc24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.369035 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.390095 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.405039 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:25:05 crc kubenswrapper[4867]: E1006 13:25:05.405619 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="ceilometer-central-agent" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.405640 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="ceilometer-central-agent" Oct 06 13:25:05 crc kubenswrapper[4867]: E1006 13:25:05.405679 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="ceilometer-notification-agent" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.405685 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="ceilometer-notification-agent" Oct 06 13:25:05 crc kubenswrapper[4867]: E1006 13:25:05.405696 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="sg-core" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.405702 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="sg-core" Oct 06 13:25:05 crc kubenswrapper[4867]: E1006 13:25:05.405714 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="proxy-httpd" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.405811 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="proxy-httpd" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.406011 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="sg-core" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.406034 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="ceilometer-notification-agent" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.406048 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="ceilometer-central-agent" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.406069 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" containerName="proxy-httpd" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.409128 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.413892 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.414396 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.414623 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.424014 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.515840 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-config-data\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.515903 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5383175c-d1e2-4f75-a9b6-5986e5062009-run-httpd\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.515947 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.515992 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-scripts\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.516018 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.516060 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.516077 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5383175c-d1e2-4f75-a9b6-5986e5062009-log-httpd\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.516113 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldzcp\" (UniqueName: \"kubernetes.io/projected/5383175c-d1e2-4f75-a9b6-5986e5062009-kube-api-access-ldzcp\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.617514 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.617557 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5383175c-d1e2-4f75-a9b6-5986e5062009-log-httpd\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.617626 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldzcp\" (UniqueName: \"kubernetes.io/projected/5383175c-d1e2-4f75-a9b6-5986e5062009-kube-api-access-ldzcp\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.617734 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-config-data\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.617767 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5383175c-d1e2-4f75-a9b6-5986e5062009-run-httpd\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.617790 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.617837 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-scripts\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.617861 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.618020 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5383175c-d1e2-4f75-a9b6-5986e5062009-log-httpd\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.618377 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5383175c-d1e2-4f75-a9b6-5986e5062009-run-httpd\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.622480 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.626802 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-config-data\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.626934 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.628444 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.630732 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-scripts\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.637076 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5383175c-d1e2-4f75-a9b6-5986e5062009-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.641738 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldzcp\" (UniqueName: \"kubernetes.io/projected/5383175c-d1e2-4f75-a9b6-5986e5062009-kube-api-access-ldzcp\") pod \"ceilometer-0\" (UID: \"5383175c-d1e2-4f75-a9b6-5986e5062009\") " pod="openstack/ceilometer-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.651679 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:25:05 crc kubenswrapper[4867]: I1006 13:25:05.732307 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.064761 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.196780 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.209106 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-jvnnk"] Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.210936 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.215507 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.215675 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.219862 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jvnnk"] Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.236783 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-config-data\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.236892 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-scripts\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " 
pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.236927 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75tgt\" (UniqueName: \"kubernetes.io/projected/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-kube-api-access-75tgt\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.236953 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.338714 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-scripts\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.338803 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75tgt\" (UniqueName: \"kubernetes.io/projected/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-kube-api-access-75tgt\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.338845 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: 
\"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.339019 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-config-data\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.352022 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.352042 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-scripts\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.364942 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-config-data\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.377521 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75tgt\" (UniqueName: \"kubernetes.io/projected/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-kube-api-access-75tgt\") pod \"nova-cell1-cell-mapping-jvnnk\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " pod="openstack/nova-cell1-cell-mapping-jvnnk" 
Oct 06 13:25:06 crc kubenswrapper[4867]: I1006 13:25:06.635321 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:07 crc kubenswrapper[4867]: I1006 13:25:07.056563 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5383175c-d1e2-4f75-a9b6-5986e5062009","Type":"ContainerStarted","Data":"cdaa67402c9370e10fed2b22b97efb5f0c9fa4afa88e7e924126de206bee8423"} Oct 06 13:25:07 crc kubenswrapper[4867]: I1006 13:25:07.056947 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5383175c-d1e2-4f75-a9b6-5986e5062009","Type":"ContainerStarted","Data":"14374d497386c4f512f7b51e85b620f2c88a4260e1936e2a7d16b617bc6df3d2"} Oct 06 13:25:07 crc kubenswrapper[4867]: I1006 13:25:07.056961 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5383175c-d1e2-4f75-a9b6-5986e5062009","Type":"ContainerStarted","Data":"c913f60c550332b34f842aee52cfe82990525185b1545ed4b0bfeeed844bb65b"} Oct 06 13:25:07 crc kubenswrapper[4867]: I1006 13:25:07.111273 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jvnnk"] Oct 06 13:25:07 crc kubenswrapper[4867]: I1006 13:25:07.235242 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74202e3a-7749-43d1-80dc-84e60fb4fc24" path="/var/lib/kubelet/pods/74202e3a-7749-43d1-80dc-84e60fb4fc24/volumes" Oct 06 13:25:07 crc kubenswrapper[4867]: I1006 13:25:07.462441 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" Oct 06 13:25:07 crc kubenswrapper[4867]: I1006 13:25:07.529132 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8d7c5479-6cc94"] Oct 06 13:25:07 crc kubenswrapper[4867]: I1006 13:25:07.529435 4867 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" podUID="1e0330c8-1ee8-44f5-9b86-7c2f242c1294" containerName="dnsmasq-dns" containerID="cri-o://dea6c6f18f0394e7d92ff0808615df1a3ea310e6a7b5b7a995fdd0a3d720d39e" gracePeriod=10 Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.077454 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jvnnk" event={"ID":"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860","Type":"ContainerStarted","Data":"89a7d2fda9e6587b25239bebd4814fcae3f31f10adb9606221a28ace9295038f"} Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.077745 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jvnnk" event={"ID":"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860","Type":"ContainerStarted","Data":"616e8e82f193baa702d7eccc788621dcc326d21db251845ac41cdbeb266253c2"} Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.091347 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5383175c-d1e2-4f75-a9b6-5986e5062009","Type":"ContainerStarted","Data":"e35f7b42598be75726e07378a457a7fd2b84499a15d11f79bff1928e79dcc74a"} Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.102412 4867 generic.go:334] "Generic (PLEG): container finished" podID="1e0330c8-1ee8-44f5-9b86-7c2f242c1294" containerID="dea6c6f18f0394e7d92ff0808615df1a3ea310e6a7b5b7a995fdd0a3d720d39e" exitCode=0 Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.102470 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" event={"ID":"1e0330c8-1ee8-44f5-9b86-7c2f242c1294","Type":"ContainerDied","Data":"dea6c6f18f0394e7d92ff0808615df1a3ea310e6a7b5b7a995fdd0a3d720d39e"} Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.102498 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" 
event={"ID":"1e0330c8-1ee8-44f5-9b86-7c2f242c1294","Type":"ContainerDied","Data":"675a8217476c48fa3500b0f2724406b982c7e6aa8adee5b9cba423f79b511c0b"} Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.102509 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="675a8217476c48fa3500b0f2724406b982c7e6aa8adee5b9cba423f79b511c0b" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.105610 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-jvnnk" podStartSLOduration=2.105591919 podStartE2EDuration="2.105591919s" podCreationTimestamp="2025-10-06 13:25:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:25:08.099302767 +0000 UTC m=+1287.557250911" watchObservedRunningTime="2025-10-06 13:25:08.105591919 +0000 UTC m=+1287.563540063" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.135808 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.283193 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-swift-storage-0\") pod \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.283367 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-svc\") pod \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.283415 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-nb\") pod \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.283462 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-config\") pod \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.283524 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s62b5\" (UniqueName: \"kubernetes.io/projected/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-kube-api-access-s62b5\") pod \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.283592 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-sb\") pod \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\" (UID: \"1e0330c8-1ee8-44f5-9b86-7c2f242c1294\") " Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.289433 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-kube-api-access-s62b5" (OuterVolumeSpecName: "kube-api-access-s62b5") pod "1e0330c8-1ee8-44f5-9b86-7c2f242c1294" (UID: "1e0330c8-1ee8-44f5-9b86-7c2f242c1294"). InnerVolumeSpecName "kube-api-access-s62b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.348691 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1e0330c8-1ee8-44f5-9b86-7c2f242c1294" (UID: "1e0330c8-1ee8-44f5-9b86-7c2f242c1294"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.350832 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1e0330c8-1ee8-44f5-9b86-7c2f242c1294" (UID: "1e0330c8-1ee8-44f5-9b86-7c2f242c1294"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.356562 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1e0330c8-1ee8-44f5-9b86-7c2f242c1294" (UID: "1e0330c8-1ee8-44f5-9b86-7c2f242c1294"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.357908 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1e0330c8-1ee8-44f5-9b86-7c2f242c1294" (UID: "1e0330c8-1ee8-44f5-9b86-7c2f242c1294"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.361595 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-config" (OuterVolumeSpecName: "config") pod "1e0330c8-1ee8-44f5-9b86-7c2f242c1294" (UID: "1e0330c8-1ee8-44f5-9b86-7c2f242c1294"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.386423 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s62b5\" (UniqueName: \"kubernetes.io/projected/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-kube-api-access-s62b5\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.386601 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.386675 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.386728 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-dns-svc\") on node \"crc\" DevicePath \"\"" 
Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.386794 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:08 crc kubenswrapper[4867]: I1006 13:25:08.386875 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e0330c8-1ee8-44f5-9b86-7c2f242c1294-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:09 crc kubenswrapper[4867]: I1006 13:25:09.135273 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8d7c5479-6cc94" Oct 06 13:25:09 crc kubenswrapper[4867]: I1006 13:25:09.179275 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8d7c5479-6cc94"] Oct 06 13:25:09 crc kubenswrapper[4867]: I1006 13:25:09.187270 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d8d7c5479-6cc94"] Oct 06 13:25:09 crc kubenswrapper[4867]: I1006 13:25:09.252286 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e0330c8-1ee8-44f5-9b86-7c2f242c1294" path="/var/lib/kubelet/pods/1e0330c8-1ee8-44f5-9b86-7c2f242c1294/volumes" Oct 06 13:25:10 crc kubenswrapper[4867]: I1006 13:25:10.150567 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5383175c-d1e2-4f75-a9b6-5986e5062009","Type":"ContainerStarted","Data":"98b757d788eca17af4536fce1aab52370e7f09b2d78c124a864e0ea63a5a640c"} Oct 06 13:25:10 crc kubenswrapper[4867]: I1006 13:25:10.151137 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 13:25:13 crc kubenswrapper[4867]: I1006 13:25:13.192206 4867 generic.go:334] "Generic (PLEG): container finished" podID="5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860" containerID="89a7d2fda9e6587b25239bebd4814fcae3f31f10adb9606221a28ace9295038f" 
exitCode=0 Oct 06 13:25:13 crc kubenswrapper[4867]: I1006 13:25:13.192384 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jvnnk" event={"ID":"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860","Type":"ContainerDied","Data":"89a7d2fda9e6587b25239bebd4814fcae3f31f10adb9606221a28ace9295038f"} Oct 06 13:25:13 crc kubenswrapper[4867]: I1006 13:25:13.218664 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.460643732 podStartE2EDuration="8.218638193s" podCreationTimestamp="2025-10-06 13:25:05 +0000 UTC" firstStartedPulling="2025-10-06 13:25:06.234401385 +0000 UTC m=+1285.692349529" lastFinishedPulling="2025-10-06 13:25:08.992395846 +0000 UTC m=+1288.450343990" observedRunningTime="2025-10-06 13:25:10.17418478 +0000 UTC m=+1289.632132914" watchObservedRunningTime="2025-10-06 13:25:13.218638193 +0000 UTC m=+1292.676586337" Oct 06 13:25:13 crc kubenswrapper[4867]: I1006 13:25:13.390416 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 13:25:13 crc kubenswrapper[4867]: I1006 13:25:13.390470 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.412470 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.412542 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 
13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.644986 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.823345 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-combined-ca-bundle\") pod \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.823462 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-scripts\") pod \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.823748 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-config-data\") pod \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.823816 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75tgt\" (UniqueName: \"kubernetes.io/projected/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-kube-api-access-75tgt\") pod \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\" (UID: \"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860\") " Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.830711 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-scripts" (OuterVolumeSpecName: "scripts") pod "5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860" (UID: "5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.831592 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-kube-api-access-75tgt" (OuterVolumeSpecName: "kube-api-access-75tgt") pod "5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860" (UID: "5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860"). InnerVolumeSpecName "kube-api-access-75tgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.857489 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860" (UID: "5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.864560 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-config-data" (OuterVolumeSpecName: "config-data") pod "5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860" (UID: "5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.926822 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.926856 4867 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.926866 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:14 crc kubenswrapper[4867]: I1006 13:25:14.926877 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75tgt\" (UniqueName: \"kubernetes.io/projected/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860-kube-api-access-75tgt\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:15 crc kubenswrapper[4867]: I1006 13:25:15.234737 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jvnnk" Oct 06 13:25:15 crc kubenswrapper[4867]: I1006 13:25:15.237643 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jvnnk" event={"ID":"5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860","Type":"ContainerDied","Data":"616e8e82f193baa702d7eccc788621dcc326d21db251845ac41cdbeb266253c2"} Oct 06 13:25:15 crc kubenswrapper[4867]: I1006 13:25:15.237701 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616e8e82f193baa702d7eccc788621dcc326d21db251845ac41cdbeb266253c2" Oct 06 13:25:15 crc kubenswrapper[4867]: I1006 13:25:15.410243 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:25:15 crc kubenswrapper[4867]: I1006 13:25:15.410758 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerName="nova-api-log" containerID="cri-o://722008c23e4359d8da69118e55467fdca424b690a4d4576d86c718e72b537ac8" gracePeriod=30 Oct 06 13:25:15 crc kubenswrapper[4867]: I1006 13:25:15.410922 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerName="nova-api-api" containerID="cri-o://78c0bc7e6c48ed54a99d5cdd1ad13ad8a4d146c1126436340c6103f26b132d32" gracePeriod=30 Oct 06 13:25:15 crc kubenswrapper[4867]: I1006 13:25:15.431774 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:25:15 crc kubenswrapper[4867]: I1006 13:25:15.431996 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="258a8420-c9fc-4115-b575-687ed7d8bc2a" containerName="nova-scheduler-scheduler" containerID="cri-o://bb3e7b5a269e6d21fabdd2390579caa216cd3ac4ee06aa1899194dfa088bb840" gracePeriod=30 Oct 06 13:25:15 crc kubenswrapper[4867]: I1006 
13:25:15.447229 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:25:15 crc kubenswrapper[4867]: I1006 13:25:15.450546 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerName="nova-metadata-log" containerID="cri-o://bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26" gracePeriod=30 Oct 06 13:25:15 crc kubenswrapper[4867]: I1006 13:25:15.450793 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerName="nova-metadata-metadata" containerID="cri-o://7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64" gracePeriod=30 Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.258625 4867 generic.go:334] "Generic (PLEG): container finished" podID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerID="bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26" exitCode=143 Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.258959 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac98dfd9-17f4-4911-83b7-ae865a97d33c","Type":"ContainerDied","Data":"bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26"} Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.276001 4867 generic.go:334] "Generic (PLEG): container finished" podID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerID="722008c23e4359d8da69118e55467fdca424b690a4d4576d86c718e72b537ac8" exitCode=143 Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.276054 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0","Type":"ContainerDied","Data":"722008c23e4359d8da69118e55467fdca424b690a4d4576d86c718e72b537ac8"} Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.773081 4867 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.889720 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-config-data\") pod \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.889783 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh646\" (UniqueName: \"kubernetes.io/projected/ac98dfd9-17f4-4911-83b7-ae865a97d33c-kube-api-access-nh646\") pod \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.889981 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-nova-metadata-tls-certs\") pod \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.890083 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac98dfd9-17f4-4911-83b7-ae865a97d33c-logs\") pod \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.890138 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-combined-ca-bundle\") pod \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\" (UID: \"ac98dfd9-17f4-4911-83b7-ae865a97d33c\") " Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.895188 4867 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac98dfd9-17f4-4911-83b7-ae865a97d33c-logs" (OuterVolumeSpecName: "logs") pod "ac98dfd9-17f4-4911-83b7-ae865a97d33c" (UID: "ac98dfd9-17f4-4911-83b7-ae865a97d33c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.913664 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac98dfd9-17f4-4911-83b7-ae865a97d33c-kube-api-access-nh646" (OuterVolumeSpecName: "kube-api-access-nh646") pod "ac98dfd9-17f4-4911-83b7-ae865a97d33c" (UID: "ac98dfd9-17f4-4911-83b7-ae865a97d33c"). InnerVolumeSpecName "kube-api-access-nh646". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.936517 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-config-data" (OuterVolumeSpecName: "config-data") pod "ac98dfd9-17f4-4911-83b7-ae865a97d33c" (UID: "ac98dfd9-17f4-4911-83b7-ae865a97d33c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.947437 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac98dfd9-17f4-4911-83b7-ae865a97d33c" (UID: "ac98dfd9-17f4-4911-83b7-ae865a97d33c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.987451 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ac98dfd9-17f4-4911-83b7-ae865a97d33c" (UID: "ac98dfd9-17f4-4911-83b7-ae865a97d33c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.993076 4867 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.993118 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac98dfd9-17f4-4911-83b7-ae865a97d33c-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.993131 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.993141 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac98dfd9-17f4-4911-83b7-ae865a97d33c-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:16 crc kubenswrapper[4867]: I1006 13:25:16.993151 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh646\" (UniqueName: \"kubernetes.io/projected/ac98dfd9-17f4-4911-83b7-ae865a97d33c-kube-api-access-nh646\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.352665 4867 generic.go:334] "Generic (PLEG): container finished" 
podID="258a8420-c9fc-4115-b575-687ed7d8bc2a" containerID="bb3e7b5a269e6d21fabdd2390579caa216cd3ac4ee06aa1899194dfa088bb840" exitCode=0 Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.352791 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"258a8420-c9fc-4115-b575-687ed7d8bc2a","Type":"ContainerDied","Data":"bb3e7b5a269e6d21fabdd2390579caa216cd3ac4ee06aa1899194dfa088bb840"} Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.388659 4867 generic.go:334] "Generic (PLEG): container finished" podID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerID="7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64" exitCode=0 Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.388702 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac98dfd9-17f4-4911-83b7-ae865a97d33c","Type":"ContainerDied","Data":"7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64"} Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.388733 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac98dfd9-17f4-4911-83b7-ae865a97d33c","Type":"ContainerDied","Data":"55e5317e27f8e297b0a25e02f7f9009ec72e48f8333717d7282a6075da385cd7"} Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.388752 4867 scope.go:117] "RemoveContainer" containerID="7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.388899 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.432420 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.460300 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.490375 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:25:17 crc kubenswrapper[4867]: E1006 13:25:17.491240 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0330c8-1ee8-44f5-9b86-7c2f242c1294" containerName="init" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.491290 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0330c8-1ee8-44f5-9b86-7c2f242c1294" containerName="init" Oct 06 13:25:17 crc kubenswrapper[4867]: E1006 13:25:17.491329 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerName="nova-metadata-log" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.491340 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerName="nova-metadata-log" Oct 06 13:25:17 crc kubenswrapper[4867]: E1006 13:25:17.491370 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0330c8-1ee8-44f5-9b86-7c2f242c1294" containerName="dnsmasq-dns" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.491381 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0330c8-1ee8-44f5-9b86-7c2f242c1294" containerName="dnsmasq-dns" Oct 06 13:25:17 crc kubenswrapper[4867]: E1006 13:25:17.491394 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860" containerName="nova-manage" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.491404 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860" containerName="nova-manage" Oct 06 13:25:17 crc kubenswrapper[4867]: E1006 13:25:17.491435 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerName="nova-metadata-metadata" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.491443 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerName="nova-metadata-metadata" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.491717 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860" containerName="nova-manage" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.491739 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerName="nova-metadata-log" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.491773 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e0330c8-1ee8-44f5-9b86-7c2f242c1294" containerName="dnsmasq-dns" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.491785 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" containerName="nova-metadata-metadata" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.493704 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.503023 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.503486 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.504642 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.512784 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/299fc545-42f8-4889-8775-57b7aed64736-logs\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.512839 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw6kv\" (UniqueName: \"kubernetes.io/projected/299fc545-42f8-4889-8775-57b7aed64736-kube-api-access-mw6kv\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.512998 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299fc545-42f8-4889-8775-57b7aed64736-config-data\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.513036 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/299fc545-42f8-4889-8775-57b7aed64736-nova-metadata-tls-certs\") pod \"nova-metadata-0\" 
(UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.513092 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299fc545-42f8-4889-8775-57b7aed64736-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.519010 4867 scope.go:117] "RemoveContainer" containerID="bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.547537 4867 scope.go:117] "RemoveContainer" containerID="7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64" Oct 06 13:25:17 crc kubenswrapper[4867]: E1006 13:25:17.548067 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64\": container with ID starting with 7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64 not found: ID does not exist" containerID="7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.548104 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64"} err="failed to get container status \"7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64\": rpc error: code = NotFound desc = could not find container \"7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64\": container with ID starting with 7771e21c68516537056070cb11c13c31f218e61911593ab9c208bebf4530ad64 not found: ID does not exist" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.548132 4867 scope.go:117] "RemoveContainer" 
containerID="bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26" Oct 06 13:25:17 crc kubenswrapper[4867]: E1006 13:25:17.548927 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26\": container with ID starting with bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26 not found: ID does not exist" containerID="bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.548983 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26"} err="failed to get container status \"bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26\": rpc error: code = NotFound desc = could not find container \"bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26\": container with ID starting with bec26e23eeebd5020d214d42fa0ce2e720c9b00656223256fbec8e7e45393d26 not found: ID does not exist" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.616444 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299fc545-42f8-4889-8775-57b7aed64736-config-data\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.616506 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/299fc545-42f8-4889-8775-57b7aed64736-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.616562 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299fc545-42f8-4889-8775-57b7aed64736-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.616624 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/299fc545-42f8-4889-8775-57b7aed64736-logs\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.616651 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw6kv\" (UniqueName: \"kubernetes.io/projected/299fc545-42f8-4889-8775-57b7aed64736-kube-api-access-mw6kv\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.624937 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/299fc545-42f8-4889-8775-57b7aed64736-logs\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.626862 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/299fc545-42f8-4889-8775-57b7aed64736-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.627171 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/299fc545-42f8-4889-8775-57b7aed64736-config-data\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " 
pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.632901 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/299fc545-42f8-4889-8775-57b7aed64736-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.633436 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw6kv\" (UniqueName: \"kubernetes.io/projected/299fc545-42f8-4889-8775-57b7aed64736-kube-api-access-mw6kv\") pod \"nova-metadata-0\" (UID: \"299fc545-42f8-4889-8775-57b7aed64736\") " pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.749420 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.821758 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-combined-ca-bundle\") pod \"258a8420-c9fc-4115-b575-687ed7d8bc2a\" (UID: \"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.821830 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-config-data\") pod \"258a8420-c9fc-4115-b575-687ed7d8bc2a\" (UID: \"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.821936 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xrhj\" (UniqueName: \"kubernetes.io/projected/258a8420-c9fc-4115-b575-687ed7d8bc2a-kube-api-access-5xrhj\") pod \"258a8420-c9fc-4115-b575-687ed7d8bc2a\" (UID: 
\"258a8420-c9fc-4115-b575-687ed7d8bc2a\") " Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.837593 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258a8420-c9fc-4115-b575-687ed7d8bc2a-kube-api-access-5xrhj" (OuterVolumeSpecName: "kube-api-access-5xrhj") pod "258a8420-c9fc-4115-b575-687ed7d8bc2a" (UID: "258a8420-c9fc-4115-b575-687ed7d8bc2a"). InnerVolumeSpecName "kube-api-access-5xrhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.844355 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.864444 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "258a8420-c9fc-4115-b575-687ed7d8bc2a" (UID: "258a8420-c9fc-4115-b575-687ed7d8bc2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.887892 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-config-data" (OuterVolumeSpecName: "config-data") pod "258a8420-c9fc-4115-b575-687ed7d8bc2a" (UID: "258a8420-c9fc-4115-b575-687ed7d8bc2a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.924024 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xrhj\" (UniqueName: \"kubernetes.io/projected/258a8420-c9fc-4115-b575-687ed7d8bc2a-kube-api-access-5xrhj\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.924056 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:17 crc kubenswrapper[4867]: I1006 13:25:17.924067 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258a8420-c9fc-4115-b575-687ed7d8bc2a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.325991 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.411103 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"258a8420-c9fc-4115-b575-687ed7d8bc2a","Type":"ContainerDied","Data":"844118a0c98e39417150df5c29c44079e7ebd877ec9ba8bc2a0f4b19a95a8f99"} Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.411168 4867 scope.go:117] "RemoveContainer" containerID="bb3e7b5a269e6d21fabdd2390579caa216cd3ac4ee06aa1899194dfa088bb840" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.411125 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.414084 4867 generic.go:334] "Generic (PLEG): container finished" podID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerID="78c0bc7e6c48ed54a99d5cdd1ad13ad8a4d146c1126436340c6103f26b132d32" exitCode=0 Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.414125 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0","Type":"ContainerDied","Data":"78c0bc7e6c48ed54a99d5cdd1ad13ad8a4d146c1126436340c6103f26b132d32"} Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.458685 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.475222 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.515854 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:25:18 crc kubenswrapper[4867]: E1006 13:25:18.516978 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258a8420-c9fc-4115-b575-687ed7d8bc2a" containerName="nova-scheduler-scheduler" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.517019 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="258a8420-c9fc-4115-b575-687ed7d8bc2a" containerName="nova-scheduler-scheduler" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.517604 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="258a8420-c9fc-4115-b575-687ed7d8bc2a" containerName="nova-scheduler-scheduler" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.518863 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.522014 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.524933 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.540830 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90a63ca-3da5-420b-b2ac-b17f116f0c84-config-data\") pod \"nova-scheduler-0\" (UID: \"d90a63ca-3da5-420b-b2ac-b17f116f0c84\") " pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.541377 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z62n\" (UniqueName: \"kubernetes.io/projected/d90a63ca-3da5-420b-b2ac-b17f116f0c84-kube-api-access-8z62n\") pod \"nova-scheduler-0\" (UID: \"d90a63ca-3da5-420b-b2ac-b17f116f0c84\") " pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.541615 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90a63ca-3da5-420b-b2ac-b17f116f0c84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d90a63ca-3da5-420b-b2ac-b17f116f0c84\") " pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.644569 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90a63ca-3da5-420b-b2ac-b17f116f0c84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d90a63ca-3da5-420b-b2ac-b17f116f0c84\") " pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.644743 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90a63ca-3da5-420b-b2ac-b17f116f0c84-config-data\") pod \"nova-scheduler-0\" (UID: \"d90a63ca-3da5-420b-b2ac-b17f116f0c84\") " pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.644930 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z62n\" (UniqueName: \"kubernetes.io/projected/d90a63ca-3da5-420b-b2ac-b17f116f0c84-kube-api-access-8z62n\") pod \"nova-scheduler-0\" (UID: \"d90a63ca-3da5-420b-b2ac-b17f116f0c84\") " pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.654913 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d90a63ca-3da5-420b-b2ac-b17f116f0c84-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d90a63ca-3da5-420b-b2ac-b17f116f0c84\") " pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.656927 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d90a63ca-3da5-420b-b2ac-b17f116f0c84-config-data\") pod \"nova-scheduler-0\" (UID: \"d90a63ca-3da5-420b-b2ac-b17f116f0c84\") " pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.664017 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z62n\" (UniqueName: \"kubernetes.io/projected/d90a63ca-3da5-420b-b2ac-b17f116f0c84-kube-api-access-8z62n\") pod \"nova-scheduler-0\" (UID: \"d90a63ca-3da5-420b-b2ac-b17f116f0c84\") " pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.866679 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.889971 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.951159 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-combined-ca-bundle\") pod \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.951205 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwctk\" (UniqueName: \"kubernetes.io/projected/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-kube-api-access-dwctk\") pod \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.951314 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-public-tls-certs\") pod \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.951456 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-config-data\") pod \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.951494 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-internal-tls-certs\") pod \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\" (UID: 
\"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.951544 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-logs\") pod \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\" (UID: \"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0\") " Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.952538 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-logs" (OuterVolumeSpecName: "logs") pod "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" (UID: "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.956601 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-kube-api-access-dwctk" (OuterVolumeSpecName: "kube-api-access-dwctk") pod "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" (UID: "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0"). InnerVolumeSpecName "kube-api-access-dwctk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:25:18 crc kubenswrapper[4867]: I1006 13:25:18.991141 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-config-data" (OuterVolumeSpecName: "config-data") pod "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" (UID: "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.001571 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" (UID: "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.016687 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" (UID: "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.037485 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" (UID: "4c6d8d65-492a-485c-b9f0-4ae0173e5eb0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.056574 4867 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-logs\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.056914 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.056958 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwctk\" (UniqueName: \"kubernetes.io/projected/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-kube-api-access-dwctk\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.056982 4867 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.056995 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.057031 4867 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.233529 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258a8420-c9fc-4115-b575-687ed7d8bc2a" path="/var/lib/kubelet/pods/258a8420-c9fc-4115-b575-687ed7d8bc2a/volumes" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.234211 4867 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="ac98dfd9-17f4-4911-83b7-ae865a97d33c" path="/var/lib/kubelet/pods/ac98dfd9-17f4-4911-83b7-ae865a97d33c/volumes" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.353035 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 13:25:19 crc kubenswrapper[4867]: W1006 13:25:19.362172 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd90a63ca_3da5_420b_b2ac_b17f116f0c84.slice/crio-d938028ba4fc92453356b29a6d94a1727d38646a3a27e42762c6dd58c73a5b44 WatchSource:0}: Error finding container d938028ba4fc92453356b29a6d94a1727d38646a3a27e42762c6dd58c73a5b44: Status 404 returned error can't find the container with id d938028ba4fc92453356b29a6d94a1727d38646a3a27e42762c6dd58c73a5b44 Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.425951 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c6d8d65-492a-485c-b9f0-4ae0173e5eb0","Type":"ContainerDied","Data":"844f456527d59428924d09f6278879aa13d8aa18f3b7da682fdce8784dd4b325"} Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.426001 4867 scope.go:117] "RemoveContainer" containerID="78c0bc7e6c48ed54a99d5cdd1ad13ad8a4d146c1126436340c6103f26b132d32" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.426149 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.430602 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"299fc545-42f8-4889-8775-57b7aed64736","Type":"ContainerStarted","Data":"79948b917ffa248e26165605d9a7cb01e761e943f848cbb7b15a566ac090bcf7"} Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.430634 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"299fc545-42f8-4889-8775-57b7aed64736","Type":"ContainerStarted","Data":"9ed80b16f30e541566ec6de1c37d32731562387367b57690182ce9317601b6e8"} Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.430643 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"299fc545-42f8-4889-8775-57b7aed64736","Type":"ContainerStarted","Data":"eb6ecfee826f2dc6a6bb406418072467d5b8718dc5033abf67188cd7062979e5"} Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.433540 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d90a63ca-3da5-420b-b2ac-b17f116f0c84","Type":"ContainerStarted","Data":"d938028ba4fc92453356b29a6d94a1727d38646a3a27e42762c6dd58c73a5b44"} Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.455495 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.486112 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.498645 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.498624196 podStartE2EDuration="2.498624196s" podCreationTimestamp="2025-10-06 13:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:25:19.455754685 
+0000 UTC m=+1298.913702839" watchObservedRunningTime="2025-10-06 13:25:19.498624196 +0000 UTC m=+1298.956572340" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.520478 4867 scope.go:117] "RemoveContainer" containerID="722008c23e4359d8da69118e55467fdca424b690a4d4576d86c718e72b537ac8" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.528451 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 13:25:19 crc kubenswrapper[4867]: E1006 13:25:19.528906 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerName="nova-api-api" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.528920 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerName="nova-api-api" Oct 06 13:25:19 crc kubenswrapper[4867]: E1006 13:25:19.528953 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerName="nova-api-log" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.528959 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerName="nova-api-log" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.529135 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerName="nova-api-api" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.529147 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" containerName="nova-api-log" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.530389 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.534166 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.534696 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.534867 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.537984 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.579015 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-public-tls-certs\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.579071 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/165ccfff-2554-4af2-8ca4-be0c49e7daa8-logs\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.579209 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-config-data\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.579286 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.579372 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvbpp\" (UniqueName: \"kubernetes.io/projected/165ccfff-2554-4af2-8ca4-be0c49e7daa8-kube-api-access-gvbpp\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.579456 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.681188 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-config-data\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.681299 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.681321 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvbpp\" (UniqueName: \"kubernetes.io/projected/165ccfff-2554-4af2-8ca4-be0c49e7daa8-kube-api-access-gvbpp\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " 
pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.681345 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.681387 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-public-tls-certs\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.681412 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/165ccfff-2554-4af2-8ca4-be0c49e7daa8-logs\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.681888 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/165ccfff-2554-4af2-8ca4-be0c49e7daa8-logs\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.684511 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.685339 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-config-data\") pod 
\"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.693990 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.694609 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/165ccfff-2554-4af2-8ca4-be0c49e7daa8-public-tls-certs\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.703346 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvbpp\" (UniqueName: \"kubernetes.io/projected/165ccfff-2554-4af2-8ca4-be0c49e7daa8-kube-api-access-gvbpp\") pod \"nova-api-0\" (UID: \"165ccfff-2554-4af2-8ca4-be0c49e7daa8\") " pod="openstack/nova-api-0" Oct 06 13:25:19 crc kubenswrapper[4867]: I1006 13:25:19.851364 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 13:25:20 crc kubenswrapper[4867]: I1006 13:25:20.356582 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 13:25:20 crc kubenswrapper[4867]: W1006 13:25:20.360470 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod165ccfff_2554_4af2_8ca4_be0c49e7daa8.slice/crio-45ade796ac1bad2451664b5d1101b39b77c0892ddff399543b58a1d8e3d03f2e WatchSource:0}: Error finding container 45ade796ac1bad2451664b5d1101b39b77c0892ddff399543b58a1d8e3d03f2e: Status 404 returned error can't find the container with id 45ade796ac1bad2451664b5d1101b39b77c0892ddff399543b58a1d8e3d03f2e Oct 06 13:25:20 crc kubenswrapper[4867]: I1006 13:25:20.466374 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"165ccfff-2554-4af2-8ca4-be0c49e7daa8","Type":"ContainerStarted","Data":"45ade796ac1bad2451664b5d1101b39b77c0892ddff399543b58a1d8e3d03f2e"} Oct 06 13:25:20 crc kubenswrapper[4867]: I1006 13:25:20.469148 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d90a63ca-3da5-420b-b2ac-b17f116f0c84","Type":"ContainerStarted","Data":"6c81e1e23c80cef6523de0c219c545dbfb3a05149f9e2ffc4c11b1a378690f78"} Oct 06 13:25:20 crc kubenswrapper[4867]: I1006 13:25:20.493245 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.493225162 podStartE2EDuration="2.493225162s" podCreationTimestamp="2025-10-06 13:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:25:20.488734989 +0000 UTC m=+1299.946683143" watchObservedRunningTime="2025-10-06 13:25:20.493225162 +0000 UTC m=+1299.951173316" Oct 06 13:25:21 crc kubenswrapper[4867]: I1006 13:25:21.249725 4867 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="4c6d8d65-492a-485c-b9f0-4ae0173e5eb0" path="/var/lib/kubelet/pods/4c6d8d65-492a-485c-b9f0-4ae0173e5eb0/volumes" Oct 06 13:25:21 crc kubenswrapper[4867]: I1006 13:25:21.488701 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"165ccfff-2554-4af2-8ca4-be0c49e7daa8","Type":"ContainerStarted","Data":"6893836d4c69054387c94ce68098014b29e0c5a03af4dd35f31ec328a5ba7709"} Oct 06 13:25:21 crc kubenswrapper[4867]: I1006 13:25:21.488750 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"165ccfff-2554-4af2-8ca4-be0c49e7daa8","Type":"ContainerStarted","Data":"d38ffe5f751a7d12dc5148bbbb7c8f15046c98f10ce3ed08fef8c56e6665a9b7"} Oct 06 13:25:22 crc kubenswrapper[4867]: I1006 13:25:22.845838 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 13:25:22 crc kubenswrapper[4867]: I1006 13:25:22.846418 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 13:25:23 crc kubenswrapper[4867]: I1006 13:25:23.867232 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 13:25:27 crc kubenswrapper[4867]: I1006 13:25:27.846413 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 13:25:27 crc kubenswrapper[4867]: I1006 13:25:27.847325 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 13:25:28 crc kubenswrapper[4867]: I1006 13:25:28.861433 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="299fc545-42f8-4889-8775-57b7aed64736" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 13:25:28 crc kubenswrapper[4867]: 
I1006 13:25:28.861574 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="299fc545-42f8-4889-8775-57b7aed64736" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 13:25:28 crc kubenswrapper[4867]: I1006 13:25:28.866826 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 13:25:28 crc kubenswrapper[4867]: I1006 13:25:28.921859 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 13:25:28 crc kubenswrapper[4867]: I1006 13:25:28.957695 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=9.957651731 podStartE2EDuration="9.957651731s" podCreationTimestamp="2025-10-06 13:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:25:21.511600077 +0000 UTC m=+1300.969548221" watchObservedRunningTime="2025-10-06 13:25:28.957651731 +0000 UTC m=+1308.415599905" Oct 06 13:25:29 crc kubenswrapper[4867]: I1006 13:25:29.616686 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 13:25:29 crc kubenswrapper[4867]: I1006 13:25:29.851806 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 13:25:29 crc kubenswrapper[4867]: I1006 13:25:29.851903 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 13:25:30 crc kubenswrapper[4867]: I1006 13:25:30.879653 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="165ccfff-2554-4af2-8ca4-be0c49e7daa8" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.227:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 13:25:30 crc kubenswrapper[4867]: I1006 13:25:30.879664 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="165ccfff-2554-4af2-8ca4-be0c49e7daa8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.227:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 13:25:35 crc kubenswrapper[4867]: I1006 13:25:35.742918 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 13:25:37 crc kubenswrapper[4867]: I1006 13:25:37.854746 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 13:25:37 crc kubenswrapper[4867]: I1006 13:25:37.859983 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 13:25:37 crc kubenswrapper[4867]: I1006 13:25:37.861679 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 13:25:38 crc kubenswrapper[4867]: I1006 13:25:38.709174 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 13:25:39 crc kubenswrapper[4867]: I1006 13:25:39.866098 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 13:25:39 crc kubenswrapper[4867]: I1006 13:25:39.866188 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 13:25:39 crc kubenswrapper[4867]: I1006 13:25:39.866632 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 13:25:39 crc kubenswrapper[4867]: I1006 13:25:39.866656 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 13:25:39 crc 
kubenswrapper[4867]: I1006 13:25:39.874966 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 13:25:39 crc kubenswrapper[4867]: I1006 13:25:39.880812 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 13:25:48 crc kubenswrapper[4867]: I1006 13:25:48.027753 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 13:25:49 crc kubenswrapper[4867]: I1006 13:25:49.047031 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 13:25:51 crc kubenswrapper[4867]: I1006 13:25:51.999211 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="acd7f8b9-810f-4e76-b971-c466bf7d4a5b" containerName="rabbitmq" containerID="cri-o://6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68" gracePeriod=604797 Oct 06 13:25:52 crc kubenswrapper[4867]: I1006 13:25:52.568463 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="9f249bfb-ab86-491d-9d1c-b3930fdea27d" containerName="rabbitmq" containerID="cri-o://af16685efe24a8c80bf1e736c7f8f054bede6974e494d223cd487ec58a9fbfa8" gracePeriod=604797 Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.657138 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.735090 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-pod-info\") pod \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.735178 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-server-conf\") pod \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.735362 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-confd\") pod \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.735396 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9svjq\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-kube-api-access-9svjq\") pod \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.735458 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-plugins-conf\") pod \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.735513 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-erlang-cookie-secret\") pod \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.735605 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-tls\") pod \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.735671 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.735756 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-config-data\") pod \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.735848 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-plugins\") pod \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.735929 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-erlang-cookie\") pod \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\" (UID: \"acd7f8b9-810f-4e76-b971-c466bf7d4a5b\") " Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 
13:25:53.737542 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "acd7f8b9-810f-4e76-b971-c466bf7d4a5b" (UID: "acd7f8b9-810f-4e76-b971-c466bf7d4a5b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.742398 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "acd7f8b9-810f-4e76-b971-c466bf7d4a5b" (UID: "acd7f8b9-810f-4e76-b971-c466bf7d4a5b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.745763 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-kube-api-access-9svjq" (OuterVolumeSpecName: "kube-api-access-9svjq") pod "acd7f8b9-810f-4e76-b971-c466bf7d4a5b" (UID: "acd7f8b9-810f-4e76-b971-c466bf7d4a5b"). InnerVolumeSpecName "kube-api-access-9svjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.746117 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "acd7f8b9-810f-4e76-b971-c466bf7d4a5b" (UID: "acd7f8b9-810f-4e76-b971-c466bf7d4a5b"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.747315 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "acd7f8b9-810f-4e76-b971-c466bf7d4a5b" (UID: "acd7f8b9-810f-4e76-b971-c466bf7d4a5b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.747714 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "acd7f8b9-810f-4e76-b971-c466bf7d4a5b" (UID: "acd7f8b9-810f-4e76-b971-c466bf7d4a5b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.751714 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "acd7f8b9-810f-4e76-b971-c466bf7d4a5b" (UID: "acd7f8b9-810f-4e76-b971-c466bf7d4a5b"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.756801 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-pod-info" (OuterVolumeSpecName: "pod-info") pod "acd7f8b9-810f-4e76-b971-c466bf7d4a5b" (UID: "acd7f8b9-810f-4e76-b971-c466bf7d4a5b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.807715 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-config-data" (OuterVolumeSpecName: "config-data") pod "acd7f8b9-810f-4e76-b971-c466bf7d4a5b" (UID: "acd7f8b9-810f-4e76-b971-c466bf7d4a5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.841326 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.841362 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.841374 4867 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-pod-info\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.841382 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9svjq\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-kube-api-access-9svjq\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.841391 4867 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.841398 4867 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.841406 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.841428 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.841437 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.863813 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-server-conf" (OuterVolumeSpecName: "server-conf") pod "acd7f8b9-810f-4e76-b971-c466bf7d4a5b" (UID: "acd7f8b9-810f-4e76-b971-c466bf7d4a5b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.896946 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.898833 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f249bfb-ab86-491d-9d1c-b3930fdea27d" containerID="af16685efe24a8c80bf1e736c7f8f054bede6974e494d223cd487ec58a9fbfa8" exitCode=0
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.898986 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f249bfb-ab86-491d-9d1c-b3930fdea27d","Type":"ContainerDied","Data":"af16685efe24a8c80bf1e736c7f8f054bede6974e494d223cd487ec58a9fbfa8"}
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.906348 4867 generic.go:334] "Generic (PLEG): container finished" podID="acd7f8b9-810f-4e76-b971-c466bf7d4a5b" containerID="6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68" exitCode=0
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.906382 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"acd7f8b9-810f-4e76-b971-c466bf7d4a5b","Type":"ContainerDied","Data":"6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68"}
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.906400 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"acd7f8b9-810f-4e76-b971-c466bf7d4a5b","Type":"ContainerDied","Data":"e5088680bb2cbc26e4df1b53ba9ec79152a37c9f2acbb99888ad7bdc7995ec7e"}
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.906424 4867 scope.go:117] "RemoveContainer" containerID="6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68"
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.906636 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.930645 4867 scope.go:117] "RemoveContainer" containerID="20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f"
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.943873 4867 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-server-conf\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.944266 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.952427 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "acd7f8b9-810f-4e76-b971-c466bf7d4a5b" (UID: "acd7f8b9-810f-4e76-b971-c466bf7d4a5b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.969384 4867 scope.go:117] "RemoveContainer" containerID="6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68"
Oct 06 13:25:53 crc kubenswrapper[4867]: E1006 13:25:53.973375 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68\": container with ID starting with 6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68 not found: ID does not exist" containerID="6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68"
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.973416 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68"} err="failed to get container status \"6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68\": rpc error: code = NotFound desc = could not find container \"6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68\": container with ID starting with 6a3bf3ee75000bacfe61c88134793e16f0291f997758d5889bd76b010a207d68 not found: ID does not exist"
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.973443 4867 scope.go:117] "RemoveContainer" containerID="20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f"
Oct 06 13:25:53 crc kubenswrapper[4867]: E1006 13:25:53.990949 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f\": container with ID starting with 20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f not found: ID does not exist" containerID="20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f"
Oct 06 13:25:53 crc kubenswrapper[4867]: I1006 13:25:53.991035 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f"} err="failed to get container status \"20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f\": rpc error: code = NotFound desc = could not find container \"20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f\": container with ID starting with 20db4aef3d1f6f5beead11f0cb913ab6379935aac4adfe39811cc35e1f27258f not found: ID does not exist"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.046483 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/acd7f8b9-810f-4e76-b971-c466bf7d4a5b-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.135573 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.250410 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.250497 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-erlang-cookie\") pod \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.251517 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9f249bfb-ab86-491d-9d1c-b3930fdea27d" (UID: "9f249bfb-ab86-491d-9d1c-b3930fdea27d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.251840 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-server-conf\") pod \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.251933 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkc7h\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-kube-api-access-zkc7h\") pod \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.251977 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-tls\") pod \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.252058 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-plugins-conf\") pod \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.252689 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f249bfb-ab86-491d-9d1c-b3930fdea27d-pod-info\") pod \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.252753 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-plugins\") pod \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.252741 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9f249bfb-ab86-491d-9d1c-b3930fdea27d" (UID: "9f249bfb-ab86-491d-9d1c-b3930fdea27d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.252785 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-confd\") pod \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.252826 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f249bfb-ab86-491d-9d1c-b3930fdea27d-erlang-cookie-secret\") pod \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.253290 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9f249bfb-ab86-491d-9d1c-b3930fdea27d" (UID: "9f249bfb-ab86-491d-9d1c-b3930fdea27d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.253351 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-config-data\") pod \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\" (UID: \"9f249bfb-ab86-491d-9d1c-b3930fdea27d\") "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.255618 4867 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-plugins-conf\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.255637 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.255647 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.257998 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "9f249bfb-ab86-491d-9d1c-b3930fdea27d" (UID: "9f249bfb-ab86-491d-9d1c-b3930fdea27d"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.258051 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9f249bfb-ab86-491d-9d1c-b3930fdea27d-pod-info" (OuterVolumeSpecName: "pod-info") pod "9f249bfb-ab86-491d-9d1c-b3930fdea27d" (UID: "9f249bfb-ab86-491d-9d1c-b3930fdea27d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.259756 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9f249bfb-ab86-491d-9d1c-b3930fdea27d" (UID: "9f249bfb-ab86-491d-9d1c-b3930fdea27d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.267035 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f249bfb-ab86-491d-9d1c-b3930fdea27d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9f249bfb-ab86-491d-9d1c-b3930fdea27d" (UID: "9f249bfb-ab86-491d-9d1c-b3930fdea27d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.275051 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-kube-api-access-zkc7h" (OuterVolumeSpecName: "kube-api-access-zkc7h") pod "9f249bfb-ab86-491d-9d1c-b3930fdea27d" (UID: "9f249bfb-ab86-491d-9d1c-b3930fdea27d"). InnerVolumeSpecName "kube-api-access-zkc7h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.324442 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-config-data" (OuterVolumeSpecName: "config-data") pod "9f249bfb-ab86-491d-9d1c-b3930fdea27d" (UID: "9f249bfb-ab86-491d-9d1c-b3930fdea27d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.331879 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-server-conf" (OuterVolumeSpecName: "server-conf") pod "9f249bfb-ab86-491d-9d1c-b3930fdea27d" (UID: "9f249bfb-ab86-491d-9d1c-b3930fdea27d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.357288 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.357327 4867 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-server-conf\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.357340 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkc7h\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-kube-api-access-zkc7h\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.357351 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.357361 4867 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f249bfb-ab86-491d-9d1c-b3930fdea27d-pod-info\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.357369 4867 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f249bfb-ab86-491d-9d1c-b3930fdea27d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.357377 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f249bfb-ab86-491d-9d1c-b3930fdea27d-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.380965 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9f249bfb-ab86-491d-9d1c-b3930fdea27d" (UID: "9f249bfb-ab86-491d-9d1c-b3930fdea27d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.390038 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.459264 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.459314 4867 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f249bfb-ab86-491d-9d1c-b3930fdea27d-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.476729 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.491371 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.499206 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 13:25:54 crc kubenswrapper[4867]: E1006 13:25:54.499697 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd7f8b9-810f-4e76-b971-c466bf7d4a5b" containerName="setup-container"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.499717 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd7f8b9-810f-4e76-b971-c466bf7d4a5b" containerName="setup-container"
Oct 06 13:25:54 crc kubenswrapper[4867]: E1006 13:25:54.499733 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f249bfb-ab86-491d-9d1c-b3930fdea27d" containerName="rabbitmq"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.499738 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f249bfb-ab86-491d-9d1c-b3930fdea27d" containerName="rabbitmq"
Oct 06 13:25:54 crc kubenswrapper[4867]: E1006 13:25:54.499746 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd7f8b9-810f-4e76-b971-c466bf7d4a5b" containerName="rabbitmq"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.499752 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd7f8b9-810f-4e76-b971-c466bf7d4a5b" containerName="rabbitmq"
Oct 06 13:25:54 crc kubenswrapper[4867]: E1006 13:25:54.499773 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f249bfb-ab86-491d-9d1c-b3930fdea27d" containerName="setup-container"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.499779 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f249bfb-ab86-491d-9d1c-b3930fdea27d" containerName="setup-container"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.499974 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd7f8b9-810f-4e76-b971-c466bf7d4a5b" containerName="rabbitmq"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.499994 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f249bfb-ab86-491d-9d1c-b3930fdea27d" containerName="rabbitmq"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.501817 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.504431 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.504478 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.504525 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.504594 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.505039 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.505399 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.505411 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bjnpx"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.525690 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.561479 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.561555 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.561602 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.561633 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.561657 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.561686 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q87c\" (UniqueName: \"kubernetes.io/projected/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-kube-api-access-9q87c\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.561797 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.561855 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.561881 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.561905 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.565124 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.667358 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.667718 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.667763 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.667792 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.667810 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.667834 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q87c\" (UniqueName: \"kubernetes.io/projected/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-kube-api-access-9q87c\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.667874 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.667921 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.667948 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.667977 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.668039 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.668320 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.668348 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.669014 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.669127 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.670860 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.673873 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.674036 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.674812 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.675387 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.682723 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.686502 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q87c\" (UniqueName: \"kubernetes.io/projected/5d0a4a4a-9d75-4d2b-aeb8-1903093398d0-kube-api-access-9q87c\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.712358 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0\") " pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.826210 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.943972 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.944202 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9f249bfb-ab86-491d-9d1c-b3930fdea27d","Type":"ContainerDied","Data":"95afb45fcc88dfad0aa8c21882c027abe8b7fa9d313b2829d338b3ecdf624847"}
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.944320 4867 scope.go:117] "RemoveContainer" containerID="af16685efe24a8c80bf1e736c7f8f054bede6974e494d223cd487ec58a9fbfa8"
Oct 06 13:25:54 crc kubenswrapper[4867]: I1006 13:25:54.977625 4867 scope.go:117] "RemoveContainer" containerID="ed05e3e744e02f905cc03b57571a1f3fbc2df72ee0c2ceefc3ed6b911570636c"
Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.042786 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.062471 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.072524 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.074520 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.079438 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.079684 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.079817 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.080060 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.080176 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-m7h52" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.081229 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.082977 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.091515 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.186959 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6669c79a-e288-4d00-8add-bffd6b33b8b9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.187024 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.187048 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.187126 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.187187 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6669c79a-e288-4d00-8add-bffd6b33b8b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.187227 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6669c79a-e288-4d00-8add-bffd6b33b8b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.187270 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.187294 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6669c79a-e288-4d00-8add-bffd6b33b8b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.187332 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rgzs\" (UniqueName: \"kubernetes.io/projected/6669c79a-e288-4d00-8add-bffd6b33b8b9-kube-api-access-9rgzs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.187348 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6669c79a-e288-4d00-8add-bffd6b33b8b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.187384 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.234667 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f249bfb-ab86-491d-9d1c-b3930fdea27d" 
path="/var/lib/kubelet/pods/9f249bfb-ab86-491d-9d1c-b3930fdea27d/volumes" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.235813 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd7f8b9-810f-4e76-b971-c466bf7d4a5b" path="/var/lib/kubelet/pods/acd7f8b9-810f-4e76-b971-c466bf7d4a5b/volumes" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.289912 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6669c79a-e288-4d00-8add-bffd6b33b8b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290031 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6669c79a-e288-4d00-8add-bffd6b33b8b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290083 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290106 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6669c79a-e288-4d00-8add-bffd6b33b8b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290157 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rgzs\" (UniqueName: 
\"kubernetes.io/projected/6669c79a-e288-4d00-8add-bffd6b33b8b9-kube-api-access-9rgzs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290174 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6669c79a-e288-4d00-8add-bffd6b33b8b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290234 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290298 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6669c79a-e288-4d00-8add-bffd6b33b8b9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290353 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290379 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-tls\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290434 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290963 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.291522 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6669c79a-e288-4d00-8add-bffd6b33b8b9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.291632 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.292166 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6669c79a-e288-4d00-8add-bffd6b33b8b9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.290440 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.294423 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6669c79a-e288-4d00-8add-bffd6b33b8b9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.294817 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6669c79a-e288-4d00-8add-bffd6b33b8b9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.295144 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.295362 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6669c79a-e288-4d00-8add-bffd6b33b8b9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.305770 4867 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6669c79a-e288-4d00-8add-bffd6b33b8b9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.308974 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rgzs\" (UniqueName: \"kubernetes.io/projected/6669c79a-e288-4d00-8add-bffd6b33b8b9-kube-api-access-9rgzs\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.322105 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6669c79a-e288-4d00-8add-bffd6b33b8b9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.368783 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.405045 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.905648 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 13:25:55 crc kubenswrapper[4867]: W1006 13:25:55.912384 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6669c79a_e288_4d00_8add_bffd6b33b8b9.slice/crio-4518a5f5c76334311dc70cd6304f7908c62a78b6a3bf0d4f93cb0449bbed31b1 WatchSource:0}: Error finding container 4518a5f5c76334311dc70cd6304f7908c62a78b6a3bf0d4f93cb0449bbed31b1: Status 404 returned error can't find the container with id 4518a5f5c76334311dc70cd6304f7908c62a78b6a3bf0d4f93cb0449bbed31b1 Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.963394 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6669c79a-e288-4d00-8add-bffd6b33b8b9","Type":"ContainerStarted","Data":"4518a5f5c76334311dc70cd6304f7908c62a78b6a3bf0d4f93cb0449bbed31b1"} Oct 06 13:25:55 crc kubenswrapper[4867]: I1006 13:25:55.965746 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0","Type":"ContainerStarted","Data":"effe5b811b5b3892254e551bdbdaaa21ec76ce889091260ccccc25009fe0dc12"} Oct 06 13:25:56 crc kubenswrapper[4867]: I1006 13:25:56.982429 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0","Type":"ContainerStarted","Data":"1aa3378ca29fca3e9cb11a176554c9f86c7ec9593dae701d7d4a404d3bf41003"} Oct 06 13:25:57 crc kubenswrapper[4867]: I1006 13:25:57.993614 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6669c79a-e288-4d00-8add-bffd6b33b8b9","Type":"ContainerStarted","Data":"2266be99d59a81e2a2d0cdb26f5f9fe31b480c8d0968140081f24ce00f8333f7"} Oct 06 
13:25:59 crc kubenswrapper[4867]: I1006 13:25:59.569177 4867 scope.go:117] "RemoveContainer" containerID="a135d028414175f5944be7207f89535aa7d79d348a7d48149be7c1dc3feb0221" Oct 06 13:25:59 crc kubenswrapper[4867]: I1006 13:25:59.593366 4867 scope.go:117] "RemoveContainer" containerID="597a479a3cb886a14d2e7909f3c6b700f6a988b0c7e14aa3b60bbad10d6ed2a9" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.453537 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55d65d9c9f-h5hpn"] Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.459440 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.461496 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.464882 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d65d9c9f-h5hpn"] Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.551020 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvpsz\" (UniqueName: \"kubernetes.io/projected/a6e4881f-b7c5-41f4-be36-83c0cff916c4-kube-api-access-zvpsz\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.551078 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-swift-storage-0\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.551125 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-svc\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.551173 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-config\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.551299 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-sb\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.551429 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-openstack-edpm-ipam\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.551493 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-nb\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.653374 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvpsz\" (UniqueName: \"kubernetes.io/projected/a6e4881f-b7c5-41f4-be36-83c0cff916c4-kube-api-access-zvpsz\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.653429 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-swift-storage-0\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.653473 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-svc\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.653499 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-config\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.653543 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-sb\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.653580 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-openstack-edpm-ipam\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.653600 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-nb\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.654770 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-openstack-edpm-ipam\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.654905 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-svc\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.654938 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-config\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.654905 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-sb\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.655300 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-nb\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.655525 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-swift-storage-0\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.686957 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvpsz\" (UniqueName: \"kubernetes.io/projected/a6e4881f-b7c5-41f4-be36-83c0cff916c4-kube-api-access-zvpsz\") pod \"dnsmasq-dns-55d65d9c9f-h5hpn\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:03 crc kubenswrapper[4867]: I1006 13:26:03.789838 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:04 crc kubenswrapper[4867]: I1006 13:26:04.415667 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d65d9c9f-h5hpn"] Oct 06 13:26:05 crc kubenswrapper[4867]: I1006 13:26:05.088811 4867 generic.go:334] "Generic (PLEG): container finished" podID="a6e4881f-b7c5-41f4-be36-83c0cff916c4" containerID="c11abe622b4415533783bf8f03c8a48c02b27b4bcfb647742e3ded883eb72dec" exitCode=0 Oct 06 13:26:05 crc kubenswrapper[4867]: I1006 13:26:05.088926 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" event={"ID":"a6e4881f-b7c5-41f4-be36-83c0cff916c4","Type":"ContainerDied","Data":"c11abe622b4415533783bf8f03c8a48c02b27b4bcfb647742e3ded883eb72dec"} Oct 06 13:26:05 crc kubenswrapper[4867]: I1006 13:26:05.089099 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" event={"ID":"a6e4881f-b7c5-41f4-be36-83c0cff916c4","Type":"ContainerStarted","Data":"a64bfcf5f4d5d573fe864597a5ab46ba4aa332dd0706e5b15433c6060e5c9184"} Oct 06 13:26:06 crc kubenswrapper[4867]: I1006 13:26:06.098741 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" event={"ID":"a6e4881f-b7c5-41f4-be36-83c0cff916c4","Type":"ContainerStarted","Data":"333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321"} Oct 06 13:26:06 crc kubenswrapper[4867]: I1006 13:26:06.099209 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:06 crc kubenswrapper[4867]: I1006 13:26:06.125388 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" podStartSLOduration=3.125368776 podStartE2EDuration="3.125368776s" podCreationTimestamp="2025-10-06 13:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:26:06.116127135 +0000 UTC m=+1345.574075279" watchObservedRunningTime="2025-10-06 13:26:06.125368776 +0000 UTC m=+1345.583316920" Oct 06 13:26:13 crc kubenswrapper[4867]: I1006 13:26:13.792461 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:13 crc kubenswrapper[4867]: I1006 13:26:13.942811 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d7fff947c-95sph"] Oct 06 13:26:13 crc kubenswrapper[4867]: I1006 13:26:13.943585 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" podUID="bb55b4ec-8008-40ec-922c-15dab4b1dcd6" containerName="dnsmasq-dns" containerID="cri-o://9f51ebb4cae8e38fa5fcdebce7b297cf8834b65a7ed5e6f0ec7ede21e4db0b00" gracePeriod=10 Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.093385 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c48bdb645-wtbz6"] Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.096703 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.131138 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c48bdb645-wtbz6"] Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.172305 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-config\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.172609 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-dns-swift-storage-0\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.172708 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-ovsdbserver-nb\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.172986 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.173315 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-ovsdbserver-sb\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.173449 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-dns-svc\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.173506 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjl7r\" (UniqueName: \"kubernetes.io/projected/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-kube-api-access-kjl7r\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.193964 4867 generic.go:334] "Generic (PLEG): container finished" podID="bb55b4ec-8008-40ec-922c-15dab4b1dcd6" containerID="9f51ebb4cae8e38fa5fcdebce7b297cf8834b65a7ed5e6f0ec7ede21e4db0b00" exitCode=0 Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.194040 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" event={"ID":"bb55b4ec-8008-40ec-922c-15dab4b1dcd6","Type":"ContainerDied","Data":"9f51ebb4cae8e38fa5fcdebce7b297cf8834b65a7ed5e6f0ec7ede21e4db0b00"} Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.276642 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-ovsdbserver-sb\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: 
\"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.276778 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-dns-svc\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.276840 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjl7r\" (UniqueName: \"kubernetes.io/projected/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-kube-api-access-kjl7r\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.276949 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-config\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.277004 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-dns-swift-storage-0\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.277046 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-ovsdbserver-nb\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " 
pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.277099 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.279552 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-ovsdbserver-sb\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.279570 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-dns-svc\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.279583 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-ovsdbserver-nb\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.279763 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc 
kubenswrapper[4867]: I1006 13:26:14.279859 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-config\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.279859 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-dns-swift-storage-0\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.316693 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjl7r\" (UniqueName: \"kubernetes.io/projected/662008a2-cb52-48d6-bd6e-7e1c9bd511cf-kube-api-access-kjl7r\") pod \"dnsmasq-dns-6c48bdb645-wtbz6\" (UID: \"662008a2-cb52-48d6-bd6e-7e1c9bd511cf\") " pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.471486 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.634764 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.789106 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-config\") pod \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.790526 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-svc\") pod \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.790611 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbbhz\" (UniqueName: \"kubernetes.io/projected/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-kube-api-access-wbbhz\") pod \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.790699 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-sb\") pod \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.791491 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-nb\") pod \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.791604 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-swift-storage-0\") pod \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\" (UID: \"bb55b4ec-8008-40ec-922c-15dab4b1dcd6\") " Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.798827 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-kube-api-access-wbbhz" (OuterVolumeSpecName: "kube-api-access-wbbhz") pod "bb55b4ec-8008-40ec-922c-15dab4b1dcd6" (UID: "bb55b4ec-8008-40ec-922c-15dab4b1dcd6"). InnerVolumeSpecName "kube-api-access-wbbhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.874104 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb55b4ec-8008-40ec-922c-15dab4b1dcd6" (UID: "bb55b4ec-8008-40ec-922c-15dab4b1dcd6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.902675 4867 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.902715 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbbhz\" (UniqueName: \"kubernetes.io/projected/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-kube-api-access-wbbhz\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.976241 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb55b4ec-8008-40ec-922c-15dab4b1dcd6" (UID: "bb55b4ec-8008-40ec-922c-15dab4b1dcd6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:26:14 crc kubenswrapper[4867]: I1006 13:26:14.976291 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb55b4ec-8008-40ec-922c-15dab4b1dcd6" (UID: "bb55b4ec-8008-40ec-922c-15dab4b1dcd6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.006848 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb55b4ec-8008-40ec-922c-15dab4b1dcd6" (UID: "bb55b4ec-8008-40ec-922c-15dab4b1dcd6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.009276 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.009310 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.009322 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.059136 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-config" (OuterVolumeSpecName: "config") pod "bb55b4ec-8008-40ec-922c-15dab4b1dcd6" (UID: "bb55b4ec-8008-40ec-922c-15dab4b1dcd6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.113013 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c48bdb645-wtbz6"] Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.126224 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb55b4ec-8008-40ec-922c-15dab4b1dcd6-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.217657 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" event={"ID":"662008a2-cb52-48d6-bd6e-7e1c9bd511cf","Type":"ContainerStarted","Data":"54e8591d7121fb8e4f6302d6e32c5cd6b85853ee1027a681c0d48a95ac49b8fd"} Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.228830 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.265062 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" event={"ID":"bb55b4ec-8008-40ec-922c-15dab4b1dcd6","Type":"ContainerDied","Data":"8a4a889426479d1f1d48b7f24ac54bef54f36168ef34b85bf913db4107f26ceb"} Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.265649 4867 scope.go:117] "RemoveContainer" containerID="9f51ebb4cae8e38fa5fcdebce7b297cf8834b65a7ed5e6f0ec7ede21e4db0b00" Oct 06 13:26:15 crc kubenswrapper[4867]: I1006 13:26:15.368616 4867 scope.go:117] "RemoveContainer" containerID="b2ef4e326b88fe5ee81c9beba529dd727f3de468a2ff004a2a12ab3c09743f6a" Oct 06 13:26:16 crc kubenswrapper[4867]: I1006 13:26:16.242443 4867 generic.go:334] "Generic (PLEG): container finished" podID="662008a2-cb52-48d6-bd6e-7e1c9bd511cf" containerID="fc61880d80f44d5cabdd03af7a4681b326803524a69117e8d5b4dbdd8ac617c3" exitCode=0 Oct 06 13:26:16 crc kubenswrapper[4867]: I1006 13:26:16.242528 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" event={"ID":"662008a2-cb52-48d6-bd6e-7e1c9bd511cf","Type":"ContainerDied","Data":"fc61880d80f44d5cabdd03af7a4681b326803524a69117e8d5b4dbdd8ac617c3"} Oct 06 13:26:17 crc kubenswrapper[4867]: I1006 13:26:17.256887 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" event={"ID":"662008a2-cb52-48d6-bd6e-7e1c9bd511cf","Type":"ContainerStarted","Data":"95766bc722a7165dcfc0aa571591f37f4d7957fa29ff9e22a735627b42c82069"} Oct 06 13:26:17 crc kubenswrapper[4867]: I1006 13:26:17.258066 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:17 crc kubenswrapper[4867]: I1006 13:26:17.289815 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" podStartSLOduration=3.289765694 podStartE2EDuration="3.289765694s" podCreationTimestamp="2025-10-06 13:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:26:17.280387479 +0000 UTC m=+1356.738335663" watchObservedRunningTime="2025-10-06 13:26:17.289765694 +0000 UTC m=+1356.747713878" Oct 06 13:26:24 crc kubenswrapper[4867]: I1006 13:26:24.475514 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c48bdb645-wtbz6" Oct 06 13:26:24 crc kubenswrapper[4867]: I1006 13:26:24.565911 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d65d9c9f-h5hpn"] Oct 06 13:26:24 crc kubenswrapper[4867]: I1006 13:26:24.566313 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" podUID="a6e4881f-b7c5-41f4-be36-83c0cff916c4" containerName="dnsmasq-dns" containerID="cri-o://333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321" gracePeriod=10 Oct 06 13:26:25 
crc kubenswrapper[4867]: I1006 13:26:25.215280 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.317658 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvpsz\" (UniqueName: \"kubernetes.io/projected/a6e4881f-b7c5-41f4-be36-83c0cff916c4-kube-api-access-zvpsz\") pod \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.317818 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-nb\") pod \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.317884 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-config\") pod \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.317960 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-openstack-edpm-ipam\") pod \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.318015 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-swift-storage-0\") pod \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " Oct 06 13:26:25 crc 
kubenswrapper[4867]: I1006 13:26:25.318033 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-svc\") pod \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.318229 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-sb\") pod \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\" (UID: \"a6e4881f-b7c5-41f4-be36-83c0cff916c4\") " Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.325913 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e4881f-b7c5-41f4-be36-83c0cff916c4-kube-api-access-zvpsz" (OuterVolumeSpecName: "kube-api-access-zvpsz") pod "a6e4881f-b7c5-41f4-be36-83c0cff916c4" (UID: "a6e4881f-b7c5-41f4-be36-83c0cff916c4"). InnerVolumeSpecName "kube-api-access-zvpsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.361796 4867 generic.go:334] "Generic (PLEG): container finished" podID="a6e4881f-b7c5-41f4-be36-83c0cff916c4" containerID="333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321" exitCode=0 Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.361855 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" event={"ID":"a6e4881f-b7c5-41f4-be36-83c0cff916c4","Type":"ContainerDied","Data":"333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321"} Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.361914 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" event={"ID":"a6e4881f-b7c5-41f4-be36-83c0cff916c4","Type":"ContainerDied","Data":"a64bfcf5f4d5d573fe864597a5ab46ba4aa332dd0706e5b15433c6060e5c9184"} Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.361909 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d65d9c9f-h5hpn" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.361960 4867 scope.go:117] "RemoveContainer" containerID="333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.380764 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6e4881f-b7c5-41f4-be36-83c0cff916c4" (UID: "a6e4881f-b7c5-41f4-be36-83c0cff916c4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.389827 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6e4881f-b7c5-41f4-be36-83c0cff916c4" (UID: "a6e4881f-b7c5-41f4-be36-83c0cff916c4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.398673 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-config" (OuterVolumeSpecName: "config") pod "a6e4881f-b7c5-41f4-be36-83c0cff916c4" (UID: "a6e4881f-b7c5-41f4-be36-83c0cff916c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.405304 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6e4881f-b7c5-41f4-be36-83c0cff916c4" (UID: "a6e4881f-b7c5-41f4-be36-83c0cff916c4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.410072 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a6e4881f-b7c5-41f4-be36-83c0cff916c4" (UID: "a6e4881f-b7c5-41f4-be36-83c0cff916c4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.411409 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a6e4881f-b7c5-41f4-be36-83c0cff916c4" (UID: "a6e4881f-b7c5-41f4-be36-83c0cff916c4"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.420581 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.420611 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvpsz\" (UniqueName: \"kubernetes.io/projected/a6e4881f-b7c5-41f4-be36-83c0cff916c4-kube-api-access-zvpsz\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.420622 4867 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.420633 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.420642 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.420650 4867 reconciler_common.go:293] "Volume detached for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.420659 4867 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6e4881f-b7c5-41f4-be36-83c0cff916c4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.490899 4867 scope.go:117] "RemoveContainer" containerID="c11abe622b4415533783bf8f03c8a48c02b27b4bcfb647742e3ded883eb72dec" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.522827 4867 scope.go:117] "RemoveContainer" containerID="333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321" Oct 06 13:26:25 crc kubenswrapper[4867]: E1006 13:26:25.523457 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321\": container with ID starting with 333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321 not found: ID does not exist" containerID="333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.523519 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321"} err="failed to get container status \"333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321\": rpc error: code = NotFound desc = could not find container \"333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321\": container with ID starting with 333e09c2ad5754627d7a74404a74eba5790f6736944ba7fb1c8e7243b02e0321 not found: ID does not exist" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.523560 4867 scope.go:117] "RemoveContainer" 
containerID="c11abe622b4415533783bf8f03c8a48c02b27b4bcfb647742e3ded883eb72dec" Oct 06 13:26:25 crc kubenswrapper[4867]: E1006 13:26:25.524475 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c11abe622b4415533783bf8f03c8a48c02b27b4bcfb647742e3ded883eb72dec\": container with ID starting with c11abe622b4415533783bf8f03c8a48c02b27b4bcfb647742e3ded883eb72dec not found: ID does not exist" containerID="c11abe622b4415533783bf8f03c8a48c02b27b4bcfb647742e3ded883eb72dec" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.524604 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11abe622b4415533783bf8f03c8a48c02b27b4bcfb647742e3ded883eb72dec"} err="failed to get container status \"c11abe622b4415533783bf8f03c8a48c02b27b4bcfb647742e3ded883eb72dec\": rpc error: code = NotFound desc = could not find container \"c11abe622b4415533783bf8f03c8a48c02b27b4bcfb647742e3ded883eb72dec\": container with ID starting with c11abe622b4415533783bf8f03c8a48c02b27b4bcfb647742e3ded883eb72dec not found: ID does not exist" Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.705417 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d65d9c9f-h5hpn"] Oct 06 13:26:25 crc kubenswrapper[4867]: I1006 13:26:25.714422 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55d65d9c9f-h5hpn"] Oct 06 13:26:27 crc kubenswrapper[4867]: I1006 13:26:27.245866 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e4881f-b7c5-41f4-be36-83c0cff916c4" path="/var/lib/kubelet/pods/a6e4881f-b7c5-41f4-be36-83c0cff916c4/volumes" Oct 06 13:26:29 crc kubenswrapper[4867]: I1006 13:26:29.415699 4867 generic.go:334] "Generic (PLEG): container finished" podID="5d0a4a4a-9d75-4d2b-aeb8-1903093398d0" containerID="1aa3378ca29fca3e9cb11a176554c9f86c7ec9593dae701d7d4a404d3bf41003" exitCode=0 Oct 06 13:26:29 crc 
kubenswrapper[4867]: I1006 13:26:29.415832 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0","Type":"ContainerDied","Data":"1aa3378ca29fca3e9cb11a176554c9f86c7ec9593dae701d7d4a404d3bf41003"} Oct 06 13:26:29 crc kubenswrapper[4867]: I1006 13:26:29.421165 4867 generic.go:334] "Generic (PLEG): container finished" podID="6669c79a-e288-4d00-8add-bffd6b33b8b9" containerID="2266be99d59a81e2a2d0cdb26f5f9fe31b480c8d0968140081f24ce00f8333f7" exitCode=0 Oct 06 13:26:29 crc kubenswrapper[4867]: I1006 13:26:29.421241 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6669c79a-e288-4d00-8add-bffd6b33b8b9","Type":"ContainerDied","Data":"2266be99d59a81e2a2d0cdb26f5f9fe31b480c8d0968140081f24ce00f8333f7"} Oct 06 13:26:30 crc kubenswrapper[4867]: I1006 13:26:30.432966 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d0a4a4a-9d75-4d2b-aeb8-1903093398d0","Type":"ContainerStarted","Data":"d17dd45553646eea5fd00503809492428ef08a2b052cafb368ea83b210d34877"} Oct 06 13:26:30 crc kubenswrapper[4867]: I1006 13:26:30.434698 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 13:26:30 crc kubenswrapper[4867]: I1006 13:26:30.434830 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6669c79a-e288-4d00-8add-bffd6b33b8b9","Type":"ContainerStarted","Data":"d8c2a1c9476359baf05cd02f2915d465f0581e0a214ec1976676759df4239159"} Oct 06 13:26:30 crc kubenswrapper[4867]: I1006 13:26:30.435036 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:26:30 crc kubenswrapper[4867]: I1006 13:26:30.482574 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.48255152 
podStartE2EDuration="36.48255152s" podCreationTimestamp="2025-10-06 13:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:26:30.457424837 +0000 UTC m=+1369.915373001" watchObservedRunningTime="2025-10-06 13:26:30.48255152 +0000 UTC m=+1369.940499654" Oct 06 13:26:30 crc kubenswrapper[4867]: I1006 13:26:30.483422 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.483413493 podStartE2EDuration="35.483413493s" podCreationTimestamp="2025-10-06 13:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:26:30.474875591 +0000 UTC m=+1369.932823735" watchObservedRunningTime="2025-10-06 13:26:30.483413493 +0000 UTC m=+1369.941361637" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.707493 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw"] Oct 06 13:26:42 crc kubenswrapper[4867]: E1006 13:26:42.708812 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb55b4ec-8008-40ec-922c-15dab4b1dcd6" containerName="init" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.708827 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb55b4ec-8008-40ec-922c-15dab4b1dcd6" containerName="init" Oct 06 13:26:42 crc kubenswrapper[4867]: E1006 13:26:42.708842 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e4881f-b7c5-41f4-be36-83c0cff916c4" containerName="init" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.708848 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e4881f-b7c5-41f4-be36-83c0cff916c4" containerName="init" Oct 06 13:26:42 crc kubenswrapper[4867]: E1006 13:26:42.708857 4867 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a6e4881f-b7c5-41f4-be36-83c0cff916c4" containerName="dnsmasq-dns" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.708863 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e4881f-b7c5-41f4-be36-83c0cff916c4" containerName="dnsmasq-dns" Oct 06 13:26:42 crc kubenswrapper[4867]: E1006 13:26:42.708877 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb55b4ec-8008-40ec-922c-15dab4b1dcd6" containerName="dnsmasq-dns" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.708882 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb55b4ec-8008-40ec-922c-15dab4b1dcd6" containerName="dnsmasq-dns" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.709078 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e4881f-b7c5-41f4-be36-83c0cff916c4" containerName="dnsmasq-dns" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.709091 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb55b4ec-8008-40ec-922c-15dab4b1dcd6" containerName="dnsmasq-dns" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.709853 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.711676 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.712937 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.716017 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.716219 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.725896 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw"] Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.775121 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.775177 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv6lj\" (UniqueName: \"kubernetes.io/projected/8f86cabb-0582-4b1c-993f-f9766defe823-kube-api-access-wv6lj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.775337 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.775528 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.877177 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.877325 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.877355 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv6lj\" (UniqueName: \"kubernetes.io/projected/8f86cabb-0582-4b1c-993f-f9766defe823-kube-api-access-wv6lj\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.877416 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.884471 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.886903 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.892614 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:42 crc kubenswrapper[4867]: I1006 13:26:42.896079 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv6lj\" (UniqueName: \"kubernetes.io/projected/8f86cabb-0582-4b1c-993f-f9766defe823-kube-api-access-wv6lj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:43 crc kubenswrapper[4867]: I1006 13:26:43.050014 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:26:43 crc kubenswrapper[4867]: I1006 13:26:43.739299 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw"] Oct 06 13:26:43 crc kubenswrapper[4867]: W1006 13:26:43.761885 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f86cabb_0582_4b1c_993f_f9766defe823.slice/crio-8468a7a0ce9bf6353b08439007db152ac13043f7770ed39fc41887d2dd621d9a WatchSource:0}: Error finding container 8468a7a0ce9bf6353b08439007db152ac13043f7770ed39fc41887d2dd621d9a: Status 404 returned error can't find the container with id 8468a7a0ce9bf6353b08439007db152ac13043f7770ed39fc41887d2dd621d9a Oct 06 13:26:44 crc kubenswrapper[4867]: I1006 13:26:44.594809 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" event={"ID":"8f86cabb-0582-4b1c-993f-f9766defe823","Type":"ContainerStarted","Data":"8468a7a0ce9bf6353b08439007db152ac13043f7770ed39fc41887d2dd621d9a"} Oct 06 13:26:44 crc kubenswrapper[4867]: I1006 13:26:44.830628 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 13:26:45 crc kubenswrapper[4867]: I1006 13:26:45.356454 4867 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","besteffort","podbb55b4ec-8008-40ec-922c-15dab4b1dcd6"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podbb55b4ec-8008-40ec-922c-15dab4b1dcd6] : Timed out while waiting for systemd to remove kubepods-besteffort-podbb55b4ec_8008_40ec_922c_15dab4b1dcd6.slice" Oct 06 13:26:45 crc kubenswrapper[4867]: E1006 13:26:45.356506 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podbb55b4ec-8008-40ec-922c-15dab4b1dcd6] : unable to destroy cgroup paths for cgroup [kubepods besteffort podbb55b4ec-8008-40ec-922c-15dab4b1dcd6] : Timed out while waiting for systemd to remove kubepods-besteffort-podbb55b4ec_8008_40ec_922c_15dab4b1dcd6.slice" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" podUID="bb55b4ec-8008-40ec-922c-15dab4b1dcd6" Oct 06 13:26:45 crc kubenswrapper[4867]: I1006 13:26:45.408493 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 13:26:45 crc kubenswrapper[4867]: I1006 13:26:45.606714 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d7fff947c-95sph" Oct 06 13:26:45 crc kubenswrapper[4867]: I1006 13:26:45.645673 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d7fff947c-95sph"] Oct 06 13:26:45 crc kubenswrapper[4867]: I1006 13:26:45.663724 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d7fff947c-95sph"] Oct 06 13:26:47 crc kubenswrapper[4867]: I1006 13:26:47.237208 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb55b4ec-8008-40ec-922c-15dab4b1dcd6" path="/var/lib/kubelet/pods/bb55b4ec-8008-40ec-922c-15dab4b1dcd6/volumes" Oct 06 13:26:53 crc kubenswrapper[4867]: I1006 13:26:53.703458 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" event={"ID":"8f86cabb-0582-4b1c-993f-f9766defe823","Type":"ContainerStarted","Data":"e07d6d5d4a540669c3d3f470faad78337073056f38a99ef5ed5504f34025e1c6"} Oct 06 13:26:53 crc kubenswrapper[4867]: I1006 13:26:53.724713 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" podStartSLOduration=2.134582224 podStartE2EDuration="11.724691804s" podCreationTimestamp="2025-10-06 13:26:42 +0000 UTC" firstStartedPulling="2025-10-06 13:26:43.765207284 +0000 UTC m=+1383.223155428" lastFinishedPulling="2025-10-06 13:26:53.355316864 +0000 UTC m=+1392.813265008" observedRunningTime="2025-10-06 13:26:53.71681322 +0000 UTC m=+1393.174761374" watchObservedRunningTime="2025-10-06 13:26:53.724691804 +0000 UTC m=+1393.182639948" Oct 06 13:26:59 crc kubenswrapper[4867]: I1006 13:26:59.757750 4867 scope.go:117] "RemoveContainer" containerID="ad47d6a4f7a6e3e30fb468a8e02c5f75a0cb02c3c527a4d939f3c3408897b0af" Oct 06 13:27:04 crc kubenswrapper[4867]: I1006 13:27:04.820979 4867 generic.go:334] "Generic (PLEG): container finished" podID="8f86cabb-0582-4b1c-993f-f9766defe823" 
containerID="e07d6d5d4a540669c3d3f470faad78337073056f38a99ef5ed5504f34025e1c6" exitCode=0 Oct 06 13:27:04 crc kubenswrapper[4867]: I1006 13:27:04.821402 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" event={"ID":"8f86cabb-0582-4b1c-993f-f9766defe823","Type":"ContainerDied","Data":"e07d6d5d4a540669c3d3f470faad78337073056f38a99ef5ed5504f34025e1c6"} Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.306894 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.451421 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-inventory\") pod \"8f86cabb-0582-4b1c-993f-f9766defe823\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.451526 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-ssh-key\") pod \"8f86cabb-0582-4b1c-993f-f9766defe823\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.451724 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-repo-setup-combined-ca-bundle\") pod \"8f86cabb-0582-4b1c-993f-f9766defe823\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.451807 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv6lj\" (UniqueName: \"kubernetes.io/projected/8f86cabb-0582-4b1c-993f-f9766defe823-kube-api-access-wv6lj\") pod 
\"8f86cabb-0582-4b1c-993f-f9766defe823\" (UID: \"8f86cabb-0582-4b1c-993f-f9766defe823\") " Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.458214 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f86cabb-0582-4b1c-993f-f9766defe823-kube-api-access-wv6lj" (OuterVolumeSpecName: "kube-api-access-wv6lj") pod "8f86cabb-0582-4b1c-993f-f9766defe823" (UID: "8f86cabb-0582-4b1c-993f-f9766defe823"). InnerVolumeSpecName "kube-api-access-wv6lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.459235 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8f86cabb-0582-4b1c-993f-f9766defe823" (UID: "8f86cabb-0582-4b1c-993f-f9766defe823"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.480752 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8f86cabb-0582-4b1c-993f-f9766defe823" (UID: "8f86cabb-0582-4b1c-993f-f9766defe823"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.480909 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-inventory" (OuterVolumeSpecName: "inventory") pod "8f86cabb-0582-4b1c-993f-f9766defe823" (UID: "8f86cabb-0582-4b1c-993f-f9766defe823"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.553875 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv6lj\" (UniqueName: \"kubernetes.io/projected/8f86cabb-0582-4b1c-993f-f9766defe823-kube-api-access-wv6lj\") on node \"crc\" DevicePath \"\"" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.553916 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.553928 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.553940 4867 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f86cabb-0582-4b1c-993f-f9766defe823-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.852076 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" event={"ID":"8f86cabb-0582-4b1c-993f-f9766defe823","Type":"ContainerDied","Data":"8468a7a0ce9bf6353b08439007db152ac13043f7770ed39fc41887d2dd621d9a"} Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.852354 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8468a7a0ce9bf6353b08439007db152ac13043f7770ed39fc41887d2dd621d9a" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.852118 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.933725 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6"] Oct 06 13:27:06 crc kubenswrapper[4867]: E1006 13:27:06.934224 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f86cabb-0582-4b1c-993f-f9766defe823" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.934267 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f86cabb-0582-4b1c-993f-f9766defe823" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.934529 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f86cabb-0582-4b1c-993f-f9766defe823" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.935414 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.938084 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.938275 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.938308 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.942580 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:27:06 crc kubenswrapper[4867]: I1006 13:27:06.946659 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6"] Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.063874 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p9xg6\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.064070 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p9xg6\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.064174 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2tl7\" (UniqueName: \"kubernetes.io/projected/05b64a8a-2fa5-4281-8e82-c27ff976b24f-kube-api-access-w2tl7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p9xg6\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.165912 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p9xg6\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.165969 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2tl7\" (UniqueName: \"kubernetes.io/projected/05b64a8a-2fa5-4281-8e82-c27ff976b24f-kube-api-access-w2tl7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p9xg6\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.166167 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p9xg6\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.181832 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p9xg6\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.183074 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p9xg6\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.183752 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2tl7\" (UniqueName: \"kubernetes.io/projected/05b64a8a-2fa5-4281-8e82-c27ff976b24f-kube-api-access-w2tl7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p9xg6\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.259707 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.755945 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6"] Oct 06 13:27:07 crc kubenswrapper[4867]: I1006 13:27:07.880750 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" event={"ID":"05b64a8a-2fa5-4281-8e82-c27ff976b24f","Type":"ContainerStarted","Data":"9d2f212d9d6b64078d966f18a5d65858a3d0373968a501b39c35e47a41d56284"} Oct 06 13:27:08 crc kubenswrapper[4867]: I1006 13:27:08.891211 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" event={"ID":"05b64a8a-2fa5-4281-8e82-c27ff976b24f","Type":"ContainerStarted","Data":"55a2a2280e081bc332ed7a86c3943b47e4946ae13b9241f8dd1b946e432d6a4f"} Oct 06 13:27:11 crc kubenswrapper[4867]: I1006 13:27:11.919613 4867 generic.go:334] "Generic (PLEG): container finished" podID="05b64a8a-2fa5-4281-8e82-c27ff976b24f" containerID="55a2a2280e081bc332ed7a86c3943b47e4946ae13b9241f8dd1b946e432d6a4f" exitCode=0 Oct 06 13:27:11 crc kubenswrapper[4867]: I1006 13:27:11.919693 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" event={"ID":"05b64a8a-2fa5-4281-8e82-c27ff976b24f","Type":"ContainerDied","Data":"55a2a2280e081bc332ed7a86c3943b47e4946ae13b9241f8dd1b946e432d6a4f"} Oct 06 13:27:12 crc kubenswrapper[4867]: I1006 13:27:12.873166 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:27:12 crc kubenswrapper[4867]: I1006 13:27:12.873567 4867 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.359086 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6"
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.485540 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-inventory\") pod \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") "
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.485722 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-ssh-key\") pod \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") "
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.485759 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2tl7\" (UniqueName: \"kubernetes.io/projected/05b64a8a-2fa5-4281-8e82-c27ff976b24f-kube-api-access-w2tl7\") pod \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\" (UID: \"05b64a8a-2fa5-4281-8e82-c27ff976b24f\") "
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.492518 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b64a8a-2fa5-4281-8e82-c27ff976b24f-kube-api-access-w2tl7" (OuterVolumeSpecName: "kube-api-access-w2tl7") pod "05b64a8a-2fa5-4281-8e82-c27ff976b24f" (UID: "05b64a8a-2fa5-4281-8e82-c27ff976b24f"). InnerVolumeSpecName "kube-api-access-w2tl7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.518542 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "05b64a8a-2fa5-4281-8e82-c27ff976b24f" (UID: "05b64a8a-2fa5-4281-8e82-c27ff976b24f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.520701 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-inventory" (OuterVolumeSpecName: "inventory") pod "05b64a8a-2fa5-4281-8e82-c27ff976b24f" (UID: "05b64a8a-2fa5-4281-8e82-c27ff976b24f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.588086 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.588124 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2tl7\" (UniqueName: \"kubernetes.io/projected/05b64a8a-2fa5-4281-8e82-c27ff976b24f-kube-api-access-w2tl7\") on node \"crc\" DevicePath \"\""
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.588136 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05b64a8a-2fa5-4281-8e82-c27ff976b24f-inventory\") on node \"crc\" DevicePath \"\""
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.940902 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6" event={"ID":"05b64a8a-2fa5-4281-8e82-c27ff976b24f","Type":"ContainerDied","Data":"9d2f212d9d6b64078d966f18a5d65858a3d0373968a501b39c35e47a41d56284"}
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.940967 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d2f212d9d6b64078d966f18a5d65858a3d0373968a501b39c35e47a41d56284"
Oct 06 13:27:13 crc kubenswrapper[4867]: I1006 13:27:13.940967 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p9xg6"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.024183 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"]
Oct 06 13:27:14 crc kubenswrapper[4867]: E1006 13:27:14.024607 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b64a8a-2fa5-4281-8e82-c27ff976b24f" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.024625 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b64a8a-2fa5-4281-8e82-c27ff976b24f" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.024834 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="05b64a8a-2fa5-4281-8e82-c27ff976b24f" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.025862 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.027946 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.028239 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.028380 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.028822 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.040583 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"]
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.096827 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.096916 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.097068 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvv6\" (UniqueName: \"kubernetes.io/projected/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-kube-api-access-wkvv6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.097118 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.199079 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvv6\" (UniqueName: \"kubernetes.io/projected/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-kube-api-access-wkvv6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.199457 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.199797 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.199935 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.203709 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.203744 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.205330 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.223742 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvv6\" (UniqueName: \"kubernetes.io/projected/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-kube-api-access-wkvv6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.381810 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.913512 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"]
Oct 06 13:27:14 crc kubenswrapper[4867]: I1006 13:27:14.950839 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9" event={"ID":"e7ba5c1b-0dcb-4509-bb81-4bda347944bf","Type":"ContainerStarted","Data":"93d5cbed38e51e3858fcb76e93915988df62fb9d504c49cdcaaf6ffabc75b962"}
Oct 06 13:27:15 crc kubenswrapper[4867]: I1006 13:27:15.962963 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9" event={"ID":"e7ba5c1b-0dcb-4509-bb81-4bda347944bf","Type":"ContainerStarted","Data":"ce81db6380b3c441fcd17dda3af6d80e3a190f49f5e40efc6cc8d443dd9f1492"}
Oct 06 13:27:15 crc kubenswrapper[4867]: I1006 13:27:15.985707 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9" podStartSLOduration=1.552610077 podStartE2EDuration="1.985689816s" podCreationTimestamp="2025-10-06 13:27:14 +0000 UTC" firstStartedPulling="2025-10-06 13:27:14.924359218 +0000 UTC m=+1414.382307352" lastFinishedPulling="2025-10-06 13:27:15.357438947 +0000 UTC m=+1414.815387091" observedRunningTime="2025-10-06 13:27:15.97734449 +0000 UTC m=+1415.435292634" watchObservedRunningTime="2025-10-06 13:27:15.985689816 +0000 UTC m=+1415.443637960"
Oct 06 13:27:37 crc kubenswrapper[4867]: I1006 13:27:37.901629 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ljx49"]
Oct 06 13:27:37 crc kubenswrapper[4867]: I1006 13:27:37.904366 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:37 crc kubenswrapper[4867]: I1006 13:27:37.923672 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljx49"]
Oct 06 13:27:37 crc kubenswrapper[4867]: I1006 13:27:37.987409 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-catalog-content\") pod \"certified-operators-ljx49\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") " pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:37 crc kubenswrapper[4867]: I1006 13:27:37.987591 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-utilities\") pod \"certified-operators-ljx49\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") " pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:37 crc kubenswrapper[4867]: I1006 13:27:37.987644 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2khcl\" (UniqueName: \"kubernetes.io/projected/feaa649b-32b9-4f3b-afb4-00627c6b4f75-kube-api-access-2khcl\") pod \"certified-operators-ljx49\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") " pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:38 crc kubenswrapper[4867]: I1006 13:27:38.089940 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-catalog-content\") pod \"certified-operators-ljx49\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") " pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:38 crc kubenswrapper[4867]: I1006 13:27:38.090412 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-utilities\") pod \"certified-operators-ljx49\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") " pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:38 crc kubenswrapper[4867]: I1006 13:27:38.090544 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2khcl\" (UniqueName: \"kubernetes.io/projected/feaa649b-32b9-4f3b-afb4-00627c6b4f75-kube-api-access-2khcl\") pod \"certified-operators-ljx49\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") " pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:38 crc kubenswrapper[4867]: I1006 13:27:38.090978 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-catalog-content\") pod \"certified-operators-ljx49\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") " pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:38 crc kubenswrapper[4867]: I1006 13:27:38.091285 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-utilities\") pod \"certified-operators-ljx49\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") " pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:38 crc kubenswrapper[4867]: I1006 13:27:38.114979 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2khcl\" (UniqueName: \"kubernetes.io/projected/feaa649b-32b9-4f3b-afb4-00627c6b4f75-kube-api-access-2khcl\") pod \"certified-operators-ljx49\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") " pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:38 crc kubenswrapper[4867]: I1006 13:27:38.279295 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:38 crc kubenswrapper[4867]: I1006 13:27:38.774369 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ljx49"]
Oct 06 13:27:39 crc kubenswrapper[4867]: I1006 13:27:39.198812 4867 generic.go:334] "Generic (PLEG): container finished" podID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" containerID="3e756a045776146d767eb39b4c327518d442413a48a6ceb79c638b80549ff337" exitCode=0
Oct 06 13:27:39 crc kubenswrapper[4867]: I1006 13:27:39.199346 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljx49" event={"ID":"feaa649b-32b9-4f3b-afb4-00627c6b4f75","Type":"ContainerDied","Data":"3e756a045776146d767eb39b4c327518d442413a48a6ceb79c638b80549ff337"}
Oct 06 13:27:39 crc kubenswrapper[4867]: I1006 13:27:39.199375 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljx49" event={"ID":"feaa649b-32b9-4f3b-afb4-00627c6b4f75","Type":"ContainerStarted","Data":"e37a7c27c20e2c7b2952b9f0ea960437200250c87395b34bf8e4969e363369a4"}
Oct 06 13:27:40 crc kubenswrapper[4867]: I1006 13:27:40.210962 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljx49" event={"ID":"feaa649b-32b9-4f3b-afb4-00627c6b4f75","Type":"ContainerStarted","Data":"f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353"}
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.225312 4867 generic.go:334] "Generic (PLEG): container finished" podID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" containerID="f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353" exitCode=0
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.231956 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljx49" event={"ID":"feaa649b-32b9-4f3b-afb4-00627c6b4f75","Type":"ContainerDied","Data":"f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353"}
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.279178 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cg2p6"]
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.282007 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.301991 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cg2p6"]
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.374928 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-catalog-content\") pod \"community-operators-cg2p6\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.375381 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjqtl\" (UniqueName: \"kubernetes.io/projected/c4dc7907-3f4f-492e-8916-0e4fbeb11148-kube-api-access-rjqtl\") pod \"community-operators-cg2p6\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.375441 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-utilities\") pod \"community-operators-cg2p6\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.477717 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-catalog-content\") pod \"community-operators-cg2p6\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.477914 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjqtl\" (UniqueName: \"kubernetes.io/projected/c4dc7907-3f4f-492e-8916-0e4fbeb11148-kube-api-access-rjqtl\") pod \"community-operators-cg2p6\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.477945 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-utilities\") pod \"community-operators-cg2p6\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.478538 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-catalog-content\") pod \"community-operators-cg2p6\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.478588 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-utilities\") pod \"community-operators-cg2p6\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.507158 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjqtl\" (UniqueName: \"kubernetes.io/projected/c4dc7907-3f4f-492e-8916-0e4fbeb11148-kube-api-access-rjqtl\") pod \"community-operators-cg2p6\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:41 crc kubenswrapper[4867]: I1006 13:27:41.603033 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:42 crc kubenswrapper[4867]: I1006 13:27:42.163740 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cg2p6"]
Oct 06 13:27:42 crc kubenswrapper[4867]: W1006 13:27:42.169964 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4dc7907_3f4f_492e_8916_0e4fbeb11148.slice/crio-bd77c9bfab5ed1a8e2d7c137420934c4914f89bf861632d9ce980ac1a232aeb7 WatchSource:0}: Error finding container bd77c9bfab5ed1a8e2d7c137420934c4914f89bf861632d9ce980ac1a232aeb7: Status 404 returned error can't find the container with id bd77c9bfab5ed1a8e2d7c137420934c4914f89bf861632d9ce980ac1a232aeb7
Oct 06 13:27:42 crc kubenswrapper[4867]: I1006 13:27:42.235369 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg2p6" event={"ID":"c4dc7907-3f4f-492e-8916-0e4fbeb11148","Type":"ContainerStarted","Data":"bd77c9bfab5ed1a8e2d7c137420934c4914f89bf861632d9ce980ac1a232aeb7"}
Oct 06 13:27:42 crc kubenswrapper[4867]: I1006 13:27:42.237952 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljx49" event={"ID":"feaa649b-32b9-4f3b-afb4-00627c6b4f75","Type":"ContainerStarted","Data":"d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda"}
Oct 06 13:27:42 crc kubenswrapper[4867]: I1006 13:27:42.257742 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ljx49" podStartSLOduration=2.763162356 podStartE2EDuration="5.257727382s" podCreationTimestamp="2025-10-06 13:27:37 +0000 UTC" firstStartedPulling="2025-10-06 13:27:39.20088469 +0000 UTC m=+1438.658832834" lastFinishedPulling="2025-10-06 13:27:41.695449706 +0000 UTC m=+1441.153397860" observedRunningTime="2025-10-06 13:27:42.256512999 +0000 UTC m=+1441.714461143" watchObservedRunningTime="2025-10-06 13:27:42.257727382 +0000 UTC m=+1441.715675526"
Oct 06 13:27:42 crc kubenswrapper[4867]: I1006 13:27:42.873871 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 13:27:42 crc kubenswrapper[4867]: I1006 13:27:42.874287 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 13:27:43 crc kubenswrapper[4867]: I1006 13:27:43.248507 4867 generic.go:334] "Generic (PLEG): container finished" podID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" containerID="cb2b05ec41a98e798cc3b3ac7a00f3096e165c6da327450092a7ad59c654ac68" exitCode=0
Oct 06 13:27:43 crc kubenswrapper[4867]: I1006 13:27:43.248559 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg2p6" event={"ID":"c4dc7907-3f4f-492e-8916-0e4fbeb11148","Type":"ContainerDied","Data":"cb2b05ec41a98e798cc3b3ac7a00f3096e165c6da327450092a7ad59c654ac68"}
Oct 06 13:27:44 crc kubenswrapper[4867]: I1006 13:27:44.264528 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg2p6" event={"ID":"c4dc7907-3f4f-492e-8916-0e4fbeb11148","Type":"ContainerStarted","Data":"528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195"}
Oct 06 13:27:45 crc kubenswrapper[4867]: I1006 13:27:45.277685 4867 generic.go:334] "Generic (PLEG): container finished" podID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" containerID="528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195" exitCode=0
Oct 06 13:27:45 crc kubenswrapper[4867]: I1006 13:27:45.277778 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg2p6" event={"ID":"c4dc7907-3f4f-492e-8916-0e4fbeb11148","Type":"ContainerDied","Data":"528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195"}
Oct 06 13:27:46 crc kubenswrapper[4867]: I1006 13:27:46.290749 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg2p6" event={"ID":"c4dc7907-3f4f-492e-8916-0e4fbeb11148","Type":"ContainerStarted","Data":"4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c"}
Oct 06 13:27:46 crc kubenswrapper[4867]: I1006 13:27:46.316485 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cg2p6" podStartSLOduration=2.691201534 podStartE2EDuration="5.316460298s" podCreationTimestamp="2025-10-06 13:27:41 +0000 UTC" firstStartedPulling="2025-10-06 13:27:43.250461417 +0000 UTC m=+1442.708409561" lastFinishedPulling="2025-10-06 13:27:45.875720181 +0000 UTC m=+1445.333668325" observedRunningTime="2025-10-06 13:27:46.307308829 +0000 UTC m=+1445.765256993" watchObservedRunningTime="2025-10-06 13:27:46.316460298 +0000 UTC m=+1445.774408452"
Oct 06 13:27:48 crc kubenswrapper[4867]: I1006 13:27:48.279453 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:48 crc kubenswrapper[4867]: I1006 13:27:48.279842 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:48 crc kubenswrapper[4867]: I1006 13:27:48.333855 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:48 crc kubenswrapper[4867]: I1006 13:27:48.396388 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:49 crc kubenswrapper[4867]: I1006 13:27:49.488468 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ljx49"]
Oct 06 13:27:50 crc kubenswrapper[4867]: I1006 13:27:50.342332 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ljx49" podUID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" containerName="registry-server" containerID="cri-o://d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda" gracePeriod=2
Oct 06 13:27:50 crc kubenswrapper[4867]: I1006 13:27:50.805778 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:50 crc kubenswrapper[4867]: I1006 13:27:50.825531 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2khcl\" (UniqueName: \"kubernetes.io/projected/feaa649b-32b9-4f3b-afb4-00627c6b4f75-kube-api-access-2khcl\") pod \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") "
Oct 06 13:27:50 crc kubenswrapper[4867]: I1006 13:27:50.826524 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-utilities\") pod \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") "
Oct 06 13:27:50 crc kubenswrapper[4867]: I1006 13:27:50.826706 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-catalog-content\") pod \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\" (UID: \"feaa649b-32b9-4f3b-afb4-00627c6b4f75\") "
Oct 06 13:27:50 crc kubenswrapper[4867]: I1006 13:27:50.827246 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-utilities" (OuterVolumeSpecName: "utilities") pod "feaa649b-32b9-4f3b-afb4-00627c6b4f75" (UID: "feaa649b-32b9-4f3b-afb4-00627c6b4f75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:27:50 crc kubenswrapper[4867]: I1006 13:27:50.827515 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 13:27:50 crc kubenswrapper[4867]: I1006 13:27:50.849276 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feaa649b-32b9-4f3b-afb4-00627c6b4f75-kube-api-access-2khcl" (OuterVolumeSpecName: "kube-api-access-2khcl") pod "feaa649b-32b9-4f3b-afb4-00627c6b4f75" (UID: "feaa649b-32b9-4f3b-afb4-00627c6b4f75"). InnerVolumeSpecName "kube-api-access-2khcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:27:50 crc kubenswrapper[4867]: I1006 13:27:50.875401 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "feaa649b-32b9-4f3b-afb4-00627c6b4f75" (UID: "feaa649b-32b9-4f3b-afb4-00627c6b4f75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:27:50 crc kubenswrapper[4867]: I1006 13:27:50.928661 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feaa649b-32b9-4f3b-afb4-00627c6b4f75-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 13:27:50 crc kubenswrapper[4867]: I1006 13:27:50.928694 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2khcl\" (UniqueName: \"kubernetes.io/projected/feaa649b-32b9-4f3b-afb4-00627c6b4f75-kube-api-access-2khcl\") on node \"crc\" DevicePath \"\""
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.358039 4867 generic.go:334] "Generic (PLEG): container finished" podID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" containerID="d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda" exitCode=0
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.358085 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljx49" event={"ID":"feaa649b-32b9-4f3b-afb4-00627c6b4f75","Type":"ContainerDied","Data":"d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda"}
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.358116 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ljx49" event={"ID":"feaa649b-32b9-4f3b-afb4-00627c6b4f75","Type":"ContainerDied","Data":"e37a7c27c20e2c7b2952b9f0ea960437200250c87395b34bf8e4969e363369a4"}
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.358135 4867 scope.go:117] "RemoveContainer" containerID="d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.358208 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ljx49"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.386768 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ljx49"]
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.394636 4867 scope.go:117] "RemoveContainer" containerID="f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.398576 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ljx49"]
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.433641 4867 scope.go:117] "RemoveContainer" containerID="3e756a045776146d767eb39b4c327518d442413a48a6ceb79c638b80549ff337"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.484391 4867 scope.go:117] "RemoveContainer" containerID="d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda"
Oct 06 13:27:51 crc kubenswrapper[4867]: E1006 13:27:51.484911 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda\": container with ID starting with d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda not found: ID does not exist" containerID="d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.484976 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda"} err="failed to get container status \"d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda\": rpc error: code = NotFound desc = could not find container \"d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda\": container with ID starting with d491e79699e9554560013d7b72655c67cb6101801fc6f13707311b96861ccdda not found: ID does not exist"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.485023 4867 scope.go:117] "RemoveContainer" containerID="f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353"
Oct 06 13:27:51 crc kubenswrapper[4867]: E1006 13:27:51.485634 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353\": container with ID starting with f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353 not found: ID does not exist" containerID="f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.485670 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353"} err="failed to get container status \"f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353\": rpc error: code = NotFound desc = could not find container \"f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353\": container with ID starting with f02e7b64229648f48713aa1bda0353153823506f33f7336bc4debb77046a4353 not found: ID does not exist"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.485713 4867 scope.go:117] "RemoveContainer" containerID="3e756a045776146d767eb39b4c327518d442413a48a6ceb79c638b80549ff337"
Oct 06 13:27:51 crc kubenswrapper[4867]: E1006 13:27:51.486096 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e756a045776146d767eb39b4c327518d442413a48a6ceb79c638b80549ff337\": container with ID starting with 3e756a045776146d767eb39b4c327518d442413a48a6ceb79c638b80549ff337 not found: ID does not exist" containerID="3e756a045776146d767eb39b4c327518d442413a48a6ceb79c638b80549ff337"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.486127 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e756a045776146d767eb39b4c327518d442413a48a6ceb79c638b80549ff337"} err="failed to get container status \"3e756a045776146d767eb39b4c327518d442413a48a6ceb79c638b80549ff337\": rpc error: code = NotFound desc = could not find container \"3e756a045776146d767eb39b4c327518d442413a48a6ceb79c638b80549ff337\": container with ID starting with 3e756a045776146d767eb39b4c327518d442413a48a6ceb79c638b80549ff337 not found: ID does not exist"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.604092 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.620811 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:51 crc kubenswrapper[4867]: I1006 13:27:51.676984 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:52 crc kubenswrapper[4867]: I1006 13:27:52.411972 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cg2p6"
Oct 06 13:27:53 crc kubenswrapper[4867]: I1006 13:27:53.233398 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" path="/var/lib/kubelet/pods/feaa649b-32b9-4f3b-afb4-00627c6b4f75/volumes"
Oct 06 13:27:53 crc kubenswrapper[4867]: I1006 13:27:53.863027 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cg2p6"]
Oct 06 13:27:55 crc kubenswrapper[4867]: I1006 13:27:55.400159 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cg2p6" podUID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" containerName="registry-server"
containerID="cri-o://4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c" gracePeriod=2 Oct 06 13:27:55 crc kubenswrapper[4867]: I1006 13:27:55.870693 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cg2p6" Oct 06 13:27:55 crc kubenswrapper[4867]: I1006 13:27:55.924928 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-utilities\") pod \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " Oct 06 13:27:55 crc kubenswrapper[4867]: I1006 13:27:55.925022 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjqtl\" (UniqueName: \"kubernetes.io/projected/c4dc7907-3f4f-492e-8916-0e4fbeb11148-kube-api-access-rjqtl\") pod \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " Oct 06 13:27:55 crc kubenswrapper[4867]: I1006 13:27:55.925045 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-catalog-content\") pod \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\" (UID: \"c4dc7907-3f4f-492e-8916-0e4fbeb11148\") " Oct 06 13:27:55 crc kubenswrapper[4867]: I1006 13:27:55.925835 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-utilities" (OuterVolumeSpecName: "utilities") pod "c4dc7907-3f4f-492e-8916-0e4fbeb11148" (UID: "c4dc7907-3f4f-492e-8916-0e4fbeb11148"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:27:55 crc kubenswrapper[4867]: I1006 13:27:55.930018 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:27:55 crc kubenswrapper[4867]: I1006 13:27:55.935518 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4dc7907-3f4f-492e-8916-0e4fbeb11148-kube-api-access-rjqtl" (OuterVolumeSpecName: "kube-api-access-rjqtl") pod "c4dc7907-3f4f-492e-8916-0e4fbeb11148" (UID: "c4dc7907-3f4f-492e-8916-0e4fbeb11148"). InnerVolumeSpecName "kube-api-access-rjqtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:27:55 crc kubenswrapper[4867]: I1006 13:27:55.971042 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4dc7907-3f4f-492e-8916-0e4fbeb11148" (UID: "c4dc7907-3f4f-492e-8916-0e4fbeb11148"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.032243 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjqtl\" (UniqueName: \"kubernetes.io/projected/c4dc7907-3f4f-492e-8916-0e4fbeb11148-kube-api-access-rjqtl\") on node \"crc\" DevicePath \"\"" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.032285 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4dc7907-3f4f-492e-8916-0e4fbeb11148-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.413997 4867 generic.go:334] "Generic (PLEG): container finished" podID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" containerID="4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c" exitCode=0 Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.414045 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg2p6" event={"ID":"c4dc7907-3f4f-492e-8916-0e4fbeb11148","Type":"ContainerDied","Data":"4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c"} Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.414081 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg2p6" event={"ID":"c4dc7907-3f4f-492e-8916-0e4fbeb11148","Type":"ContainerDied","Data":"bd77c9bfab5ed1a8e2d7c137420934c4914f89bf861632d9ce980ac1a232aeb7"} Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.414089 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cg2p6" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.414097 4867 scope.go:117] "RemoveContainer" containerID="4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.435302 4867 scope.go:117] "RemoveContainer" containerID="528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.471086 4867 scope.go:117] "RemoveContainer" containerID="cb2b05ec41a98e798cc3b3ac7a00f3096e165c6da327450092a7ad59c654ac68" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.474641 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cg2p6"] Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.486602 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cg2p6"] Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.493078 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b4jmb"] Oct 06 13:27:56 crc kubenswrapper[4867]: E1006 13:27:56.493555 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" containerName="registry-server" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.493571 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" containerName="registry-server" Oct 06 13:27:56 crc kubenswrapper[4867]: E1006 13:27:56.493588 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" containerName="extract-content" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.493594 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" containerName="extract-content" Oct 06 13:27:56 crc kubenswrapper[4867]: E1006 13:27:56.493607 4867 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" containerName="extract-utilities" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.493614 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" containerName="extract-utilities" Oct 06 13:27:56 crc kubenswrapper[4867]: E1006 13:27:56.493624 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" containerName="extract-utilities" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.493631 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" containerName="extract-utilities" Oct 06 13:27:56 crc kubenswrapper[4867]: E1006 13:27:56.493667 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" containerName="registry-server" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.493672 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" containerName="registry-server" Oct 06 13:27:56 crc kubenswrapper[4867]: E1006 13:27:56.493686 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" containerName="extract-content" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.493692 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" containerName="extract-content" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.493875 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="feaa649b-32b9-4f3b-afb4-00627c6b4f75" containerName="registry-server" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.493902 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" containerName="registry-server" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.495343 
4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.503355 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b4jmb"] Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.537755 4867 scope.go:117] "RemoveContainer" containerID="4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c" Oct 06 13:27:56 crc kubenswrapper[4867]: E1006 13:27:56.538245 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c\": container with ID starting with 4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c not found: ID does not exist" containerID="4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.538318 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c"} err="failed to get container status \"4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c\": rpc error: code = NotFound desc = could not find container \"4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c\": container with ID starting with 4c448b1aaaea03f3b097f74e24f08d9c5476e596c48a6cc6e7600dc6fea43c9c not found: ID does not exist" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.538344 4867 scope.go:117] "RemoveContainer" containerID="528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195" Oct 06 13:27:56 crc kubenswrapper[4867]: E1006 13:27:56.538713 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195\": container with ID starting 
with 528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195 not found: ID does not exist" containerID="528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.538765 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195"} err="failed to get container status \"528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195\": rpc error: code = NotFound desc = could not find container \"528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195\": container with ID starting with 528b0e239d9d5d2f49040633b900f4492f1404b48ec8b489f88315986e860195 not found: ID does not exist" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.538798 4867 scope.go:117] "RemoveContainer" containerID="cb2b05ec41a98e798cc3b3ac7a00f3096e165c6da327450092a7ad59c654ac68" Oct 06 13:27:56 crc kubenswrapper[4867]: E1006 13:27:56.539242 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb2b05ec41a98e798cc3b3ac7a00f3096e165c6da327450092a7ad59c654ac68\": container with ID starting with cb2b05ec41a98e798cc3b3ac7a00f3096e165c6da327450092a7ad59c654ac68 not found: ID does not exist" containerID="cb2b05ec41a98e798cc3b3ac7a00f3096e165c6da327450092a7ad59c654ac68" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.539280 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2b05ec41a98e798cc3b3ac7a00f3096e165c6da327450092a7ad59c654ac68"} err="failed to get container status \"cb2b05ec41a98e798cc3b3ac7a00f3096e165c6da327450092a7ad59c654ac68\": rpc error: code = NotFound desc = could not find container \"cb2b05ec41a98e798cc3b3ac7a00f3096e165c6da327450092a7ad59c654ac68\": container with ID starting with cb2b05ec41a98e798cc3b3ac7a00f3096e165c6da327450092a7ad59c654ac68 not found: ID does 
not exist" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.541023 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsdrt\" (UniqueName: \"kubernetes.io/projected/56ae859c-1679-46cc-a3ad-9f7f72bb6472-kube-api-access-zsdrt\") pod \"redhat-operators-b4jmb\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.541180 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-catalog-content\") pod \"redhat-operators-b4jmb\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.541263 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-utilities\") pod \"redhat-operators-b4jmb\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.643274 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsdrt\" (UniqueName: \"kubernetes.io/projected/56ae859c-1679-46cc-a3ad-9f7f72bb6472-kube-api-access-zsdrt\") pod \"redhat-operators-b4jmb\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.643445 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-catalog-content\") pod \"redhat-operators-b4jmb\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " 
pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.643496 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-utilities\") pod \"redhat-operators-b4jmb\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.644208 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-utilities\") pod \"redhat-operators-b4jmb\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.644227 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-catalog-content\") pod \"redhat-operators-b4jmb\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.661820 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsdrt\" (UniqueName: \"kubernetes.io/projected/56ae859c-1679-46cc-a3ad-9f7f72bb6472-kube-api-access-zsdrt\") pod \"redhat-operators-b4jmb\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:27:56 crc kubenswrapper[4867]: I1006 13:27:56.891484 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:27:57 crc kubenswrapper[4867]: I1006 13:27:57.237479 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4dc7907-3f4f-492e-8916-0e4fbeb11148" path="/var/lib/kubelet/pods/c4dc7907-3f4f-492e-8916-0e4fbeb11148/volumes" Oct 06 13:27:57 crc kubenswrapper[4867]: I1006 13:27:57.423929 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b4jmb"] Oct 06 13:27:57 crc kubenswrapper[4867]: W1006 13:27:57.438109 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ae859c_1679_46cc_a3ad_9f7f72bb6472.slice/crio-44d476962656ec25b2bef22c2dc2aee14a64ccf0bcaf1d3506ca375f5d4ab67c WatchSource:0}: Error finding container 44d476962656ec25b2bef22c2dc2aee14a64ccf0bcaf1d3506ca375f5d4ab67c: Status 404 returned error can't find the container with id 44d476962656ec25b2bef22c2dc2aee14a64ccf0bcaf1d3506ca375f5d4ab67c Oct 06 13:27:58 crc kubenswrapper[4867]: I1006 13:27:58.444492 4867 generic.go:334] "Generic (PLEG): container finished" podID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" containerID="ecbb48901bb876efbd7ba19abf0caaa4cafe483617f01f93ab6f4a98cd703db5" exitCode=0 Oct 06 13:27:58 crc kubenswrapper[4867]: I1006 13:27:58.444542 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4jmb" event={"ID":"56ae859c-1679-46cc-a3ad-9f7f72bb6472","Type":"ContainerDied","Data":"ecbb48901bb876efbd7ba19abf0caaa4cafe483617f01f93ab6f4a98cd703db5"} Oct 06 13:27:58 crc kubenswrapper[4867]: I1006 13:27:58.444822 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4jmb" event={"ID":"56ae859c-1679-46cc-a3ad-9f7f72bb6472","Type":"ContainerStarted","Data":"44d476962656ec25b2bef22c2dc2aee14a64ccf0bcaf1d3506ca375f5d4ab67c"} Oct 06 13:27:59 crc kubenswrapper[4867]: I1006 13:27:59.863721 4867 
scope.go:117] "RemoveContainer" containerID="232360b71bbd8600afe387f269e707b6841ddf8870e8cdea62fed2b11ab1c732" Oct 06 13:28:00 crc kubenswrapper[4867]: I1006 13:28:00.467359 4867 generic.go:334] "Generic (PLEG): container finished" podID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" containerID="61e19d41b352458a9aa39f02c17ea3edaab04820f61662c9861ac063f732521a" exitCode=0 Oct 06 13:28:00 crc kubenswrapper[4867]: I1006 13:28:00.467419 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4jmb" event={"ID":"56ae859c-1679-46cc-a3ad-9f7f72bb6472","Type":"ContainerDied","Data":"61e19d41b352458a9aa39f02c17ea3edaab04820f61662c9861ac063f732521a"} Oct 06 13:28:01 crc kubenswrapper[4867]: I1006 13:28:01.478599 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4jmb" event={"ID":"56ae859c-1679-46cc-a3ad-9f7f72bb6472","Type":"ContainerStarted","Data":"2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1"} Oct 06 13:28:01 crc kubenswrapper[4867]: I1006 13:28:01.497743 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b4jmb" podStartSLOduration=2.877334494 podStartE2EDuration="5.497726495s" podCreationTimestamp="2025-10-06 13:27:56 +0000 UTC" firstStartedPulling="2025-10-06 13:27:58.446156896 +0000 UTC m=+1457.904105040" lastFinishedPulling="2025-10-06 13:28:01.066548897 +0000 UTC m=+1460.524497041" observedRunningTime="2025-10-06 13:28:01.496674936 +0000 UTC m=+1460.954623070" watchObservedRunningTime="2025-10-06 13:28:01.497726495 +0000 UTC m=+1460.955674639" Oct 06 13:28:06 crc kubenswrapper[4867]: I1006 13:28:06.891828 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:28:06 crc kubenswrapper[4867]: I1006 13:28:06.892352 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:28:06 crc kubenswrapper[4867]: I1006 13:28:06.944698 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:28:07 crc kubenswrapper[4867]: I1006 13:28:07.578078 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:28:07 crc kubenswrapper[4867]: I1006 13:28:07.626485 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b4jmb"] Oct 06 13:28:09 crc kubenswrapper[4867]: I1006 13:28:09.547787 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b4jmb" podUID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" containerName="registry-server" containerID="cri-o://2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1" gracePeriod=2 Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.044508 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b4jmb" Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.129350 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsdrt\" (UniqueName: \"kubernetes.io/projected/56ae859c-1679-46cc-a3ad-9f7f72bb6472-kube-api-access-zsdrt\") pod \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.129998 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-catalog-content\") pod \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.130143 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-utilities\") pod \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\" (UID: \"56ae859c-1679-46cc-a3ad-9f7f72bb6472\") " Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.131193 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-utilities" (OuterVolumeSpecName: "utilities") pod "56ae859c-1679-46cc-a3ad-9f7f72bb6472" (UID: "56ae859c-1679-46cc-a3ad-9f7f72bb6472"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.135667 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56ae859c-1679-46cc-a3ad-9f7f72bb6472-kube-api-access-zsdrt" (OuterVolumeSpecName: "kube-api-access-zsdrt") pod "56ae859c-1679-46cc-a3ad-9f7f72bb6472" (UID: "56ae859c-1679-46cc-a3ad-9f7f72bb6472"). InnerVolumeSpecName "kube-api-access-zsdrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.228065 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56ae859c-1679-46cc-a3ad-9f7f72bb6472" (UID: "56ae859c-1679-46cc-a3ad-9f7f72bb6472"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.233729 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsdrt\" (UniqueName: \"kubernetes.io/projected/56ae859c-1679-46cc-a3ad-9f7f72bb6472-kube-api-access-zsdrt\") on node \"crc\" DevicePath \"\"" Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.233769 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.233779 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56ae859c-1679-46cc-a3ad-9f7f72bb6472-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.558035 4867 generic.go:334] "Generic (PLEG): container finished" podID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" containerID="2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1" exitCode=0 Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.558124 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b4jmb"
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.558110 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4jmb" event={"ID":"56ae859c-1679-46cc-a3ad-9f7f72bb6472","Type":"ContainerDied","Data":"2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1"}
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.558312 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4jmb" event={"ID":"56ae859c-1679-46cc-a3ad-9f7f72bb6472","Type":"ContainerDied","Data":"44d476962656ec25b2bef22c2dc2aee14a64ccf0bcaf1d3506ca375f5d4ab67c"}
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.558336 4867 scope.go:117] "RemoveContainer" containerID="2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1"
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.587191 4867 scope.go:117] "RemoveContainer" containerID="61e19d41b352458a9aa39f02c17ea3edaab04820f61662c9861ac063f732521a"
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.607434 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b4jmb"]
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.617866 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b4jmb"]
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.637927 4867 scope.go:117] "RemoveContainer" containerID="ecbb48901bb876efbd7ba19abf0caaa4cafe483617f01f93ab6f4a98cd703db5"
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.664384 4867 scope.go:117] "RemoveContainer" containerID="2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1"
Oct 06 13:28:10 crc kubenswrapper[4867]: E1006 13:28:10.665563 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1\": container with ID starting with 2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1 not found: ID does not exist" containerID="2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1"
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.665789 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1"} err="failed to get container status \"2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1\": rpc error: code = NotFound desc = could not find container \"2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1\": container with ID starting with 2496cefee2fa62af30ca4e79df40159d7a66be1ab0057e9d24ecc067f9375dc1 not found: ID does not exist"
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.665815 4867 scope.go:117] "RemoveContainer" containerID="61e19d41b352458a9aa39f02c17ea3edaab04820f61662c9861ac063f732521a"
Oct 06 13:28:10 crc kubenswrapper[4867]: E1006 13:28:10.666242 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61e19d41b352458a9aa39f02c17ea3edaab04820f61662c9861ac063f732521a\": container with ID starting with 61e19d41b352458a9aa39f02c17ea3edaab04820f61662c9861ac063f732521a not found: ID does not exist" containerID="61e19d41b352458a9aa39f02c17ea3edaab04820f61662c9861ac063f732521a"
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.666286 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e19d41b352458a9aa39f02c17ea3edaab04820f61662c9861ac063f732521a"} err="failed to get container status \"61e19d41b352458a9aa39f02c17ea3edaab04820f61662c9861ac063f732521a\": rpc error: code = NotFound desc = could not find container \"61e19d41b352458a9aa39f02c17ea3edaab04820f61662c9861ac063f732521a\": container with ID starting with 61e19d41b352458a9aa39f02c17ea3edaab04820f61662c9861ac063f732521a not found: ID does not exist"
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.666309 4867 scope.go:117] "RemoveContainer" containerID="ecbb48901bb876efbd7ba19abf0caaa4cafe483617f01f93ab6f4a98cd703db5"
Oct 06 13:28:10 crc kubenswrapper[4867]: E1006 13:28:10.666664 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecbb48901bb876efbd7ba19abf0caaa4cafe483617f01f93ab6f4a98cd703db5\": container with ID starting with ecbb48901bb876efbd7ba19abf0caaa4cafe483617f01f93ab6f4a98cd703db5 not found: ID does not exist" containerID="ecbb48901bb876efbd7ba19abf0caaa4cafe483617f01f93ab6f4a98cd703db5"
Oct 06 13:28:10 crc kubenswrapper[4867]: I1006 13:28:10.666769 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecbb48901bb876efbd7ba19abf0caaa4cafe483617f01f93ab6f4a98cd703db5"} err="failed to get container status \"ecbb48901bb876efbd7ba19abf0caaa4cafe483617f01f93ab6f4a98cd703db5\": rpc error: code = NotFound desc = could not find container \"ecbb48901bb876efbd7ba19abf0caaa4cafe483617f01f93ab6f4a98cd703db5\": container with ID starting with ecbb48901bb876efbd7ba19abf0caaa4cafe483617f01f93ab6f4a98cd703db5 not found: ID does not exist"
Oct 06 13:28:11 crc kubenswrapper[4867]: I1006 13:28:11.233907 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" path="/var/lib/kubelet/pods/56ae859c-1679-46cc-a3ad-9f7f72bb6472/volumes"
Oct 06 13:28:12 crc kubenswrapper[4867]: I1006 13:28:12.873513 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 13:28:12 crc kubenswrapper[4867]: I1006 13:28:12.873573 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 13:28:12 crc kubenswrapper[4867]: I1006 13:28:12.873614 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq"
Oct 06 13:28:12 crc kubenswrapper[4867]: I1006 13:28:12.874391 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"708d16f9a6115595b008bafc5ad0e6ec3528bd438b87bd249255c174238bf7ec"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 13:28:12 crc kubenswrapper[4867]: I1006 13:28:12.874450 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://708d16f9a6115595b008bafc5ad0e6ec3528bd438b87bd249255c174238bf7ec" gracePeriod=600
Oct 06 13:28:13 crc kubenswrapper[4867]: I1006 13:28:13.592867 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="708d16f9a6115595b008bafc5ad0e6ec3528bd438b87bd249255c174238bf7ec" exitCode=0
Oct 06 13:28:13 crc kubenswrapper[4867]: I1006 13:28:13.598070 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"708d16f9a6115595b008bafc5ad0e6ec3528bd438b87bd249255c174238bf7ec"}
Oct 06 13:28:13 crc kubenswrapper[4867]: I1006 13:28:13.598129 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d"}
Oct 06 13:28:13 crc kubenswrapper[4867]: I1006 13:28:13.598163 4867 scope.go:117] "RemoveContainer" containerID="266184608e50b4d6729b714d56d0cdb437a575eeec8c7e5f18126b05fc5a103e"
Oct 06 13:28:59 crc kubenswrapper[4867]: I1006 13:28:59.957533 4867 scope.go:117] "RemoveContainer" containerID="925011d1366643c86474669b03d4c9cd1af3c1c28b81be813f68d4b94ec7c743"
Oct 06 13:28:59 crc kubenswrapper[4867]: I1006 13:28:59.992185 4867 scope.go:117] "RemoveContainer" containerID="d8758d3ca0058ce6cc97efb76387c8f23b72b5e8f190be9e2941e226b9989a2a"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.169271 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"]
Oct 06 13:30:00 crc kubenswrapper[4867]: E1006 13:30:00.170409 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" containerName="registry-server"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.170475 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" containerName="registry-server"
Oct 06 13:30:00 crc kubenswrapper[4867]: E1006 13:30:00.170519 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" containerName="extract-content"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.170527 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" containerName="extract-content"
Oct 06 13:30:00 crc kubenswrapper[4867]: E1006 13:30:00.170548 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" containerName="extract-utilities"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.170557 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" containerName="extract-utilities"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.170858 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ae859c-1679-46cc-a3ad-9f7f72bb6472" containerName="registry-server"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.171876 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.174793 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.181865 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.197450 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"]
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.229733 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-secret-volume\") pod \"collect-profiles-29329290-lz9jt\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.230089 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5bw9\" (UniqueName: \"kubernetes.io/projected/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-kube-api-access-z5bw9\") pod \"collect-profiles-29329290-lz9jt\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.230582 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-config-volume\") pod \"collect-profiles-29329290-lz9jt\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.332494 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5bw9\" (UniqueName: \"kubernetes.io/projected/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-kube-api-access-z5bw9\") pod \"collect-profiles-29329290-lz9jt\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.333966 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-config-volume\") pod \"collect-profiles-29329290-lz9jt\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.334052 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-secret-volume\") pod \"collect-profiles-29329290-lz9jt\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.334857 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-config-volume\") pod \"collect-profiles-29329290-lz9jt\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.349032 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-secret-volume\") pod \"collect-profiles-29329290-lz9jt\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.349305 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5bw9\" (UniqueName: \"kubernetes.io/projected/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-kube-api-access-z5bw9\") pod \"collect-profiles-29329290-lz9jt\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.490769 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:00 crc kubenswrapper[4867]: I1006 13:30:00.968808 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"]
Oct 06 13:30:01 crc kubenswrapper[4867]: I1006 13:30:01.760652 4867 generic.go:334] "Generic (PLEG): container finished" podID="7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39" containerID="d4d2020afc3220f8bc4e6634e6c2733dd35251a23cc9c0d03751c753678a3c8b" exitCode=0
Oct 06 13:30:01 crc kubenswrapper[4867]: I1006 13:30:01.760792 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt" event={"ID":"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39","Type":"ContainerDied","Data":"d4d2020afc3220f8bc4e6634e6c2733dd35251a23cc9c0d03751c753678a3c8b"}
Oct 06 13:30:01 crc kubenswrapper[4867]: I1006 13:30:01.761426 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt" event={"ID":"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39","Type":"ContainerStarted","Data":"807617bd9b6ef6cfc9b21c4d84063e1138f4657948b7e6beea5086fd1f5a4c34"}
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.126416 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.219225 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-config-volume\") pod \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") "
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.219330 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5bw9\" (UniqueName: \"kubernetes.io/projected/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-kube-api-access-z5bw9\") pod \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") "
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.219570 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-secret-volume\") pod \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\" (UID: \"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39\") "
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.220720 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-config-volume" (OuterVolumeSpecName: "config-volume") pod "7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39" (UID: "7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.231072 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39" (UID: "7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.231177 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-kube-api-access-z5bw9" (OuterVolumeSpecName: "kube-api-access-z5bw9") pod "7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39" (UID: "7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39"). InnerVolumeSpecName "kube-api-access-z5bw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.323797 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.324189 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.324203 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5bw9\" (UniqueName: \"kubernetes.io/projected/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39-kube-api-access-z5bw9\") on node \"crc\" DevicePath \"\""
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.789220 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt" event={"ID":"7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39","Type":"ContainerDied","Data":"807617bd9b6ef6cfc9b21c4d84063e1138f4657948b7e6beea5086fd1f5a4c34"}
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.789283 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="807617bd9b6ef6cfc9b21c4d84063e1138f4657948b7e6beea5086fd1f5a4c34"
Oct 06 13:30:03 crc kubenswrapper[4867]: I1006 13:30:03.789316 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"
Oct 06 13:30:20 crc kubenswrapper[4867]: I1006 13:30:20.049011 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-8lllr"]
Oct 06 13:30:20 crc kubenswrapper[4867]: I1006 13:30:20.062699 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-8lllr"]
Oct 06 13:30:21 crc kubenswrapper[4867]: I1006 13:30:21.234336 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e" path="/var/lib/kubelet/pods/cec9595d-7c4e-4fab-b85b-b5c3a70aeb6e/volumes"
Oct 06 13:30:23 crc kubenswrapper[4867]: I1006 13:30:23.979114 4867 generic.go:334] "Generic (PLEG): container finished" podID="e7ba5c1b-0dcb-4509-bb81-4bda347944bf" containerID="ce81db6380b3c441fcd17dda3af6d80e3a190f49f5e40efc6cc8d443dd9f1492" exitCode=0
Oct 06 13:30:23 crc kubenswrapper[4867]: I1006 13:30:23.979187 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9" event={"ID":"e7ba5c1b-0dcb-4509-bb81-4bda347944bf","Type":"ContainerDied","Data":"ce81db6380b3c441fcd17dda3af6d80e3a190f49f5e40efc6cc8d443dd9f1492"}
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.417535 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.523038 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkvv6\" (UniqueName: \"kubernetes.io/projected/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-kube-api-access-wkvv6\") pod \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") "
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.523112 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-bootstrap-combined-ca-bundle\") pod \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") "
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.523158 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-inventory\") pod \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") "
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.523297 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-ssh-key\") pod \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\" (UID: \"e7ba5c1b-0dcb-4509-bb81-4bda347944bf\") "
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.529061 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e7ba5c1b-0dcb-4509-bb81-4bda347944bf" (UID: "e7ba5c1b-0dcb-4509-bb81-4bda347944bf"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.529291 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-kube-api-access-wkvv6" (OuterVolumeSpecName: "kube-api-access-wkvv6") pod "e7ba5c1b-0dcb-4509-bb81-4bda347944bf" (UID: "e7ba5c1b-0dcb-4509-bb81-4bda347944bf"). InnerVolumeSpecName "kube-api-access-wkvv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.557279 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e7ba5c1b-0dcb-4509-bb81-4bda347944bf" (UID: "e7ba5c1b-0dcb-4509-bb81-4bda347944bf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.558466 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-inventory" (OuterVolumeSpecName: "inventory") pod "e7ba5c1b-0dcb-4509-bb81-4bda347944bf" (UID: "e7ba5c1b-0dcb-4509-bb81-4bda347944bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.625724 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.625761 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkvv6\" (UniqueName: \"kubernetes.io/projected/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-kube-api-access-wkvv6\") on node \"crc\" DevicePath \"\""
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.625773 4867 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.625783 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7ba5c1b-0dcb-4509-bb81-4bda347944bf-inventory\") on node \"crc\" DevicePath \"\""
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.998237 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9" event={"ID":"e7ba5c1b-0dcb-4509-bb81-4bda347944bf","Type":"ContainerDied","Data":"93d5cbed38e51e3858fcb76e93915988df62fb9d504c49cdcaaf6ffabc75b962"}
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.998299 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93d5cbed38e51e3858fcb76e93915988df62fb9d504c49cdcaaf6ffabc75b962"
Oct 06 13:30:25 crc kubenswrapper[4867]: I1006 13:30:25.998333 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.119944 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"]
Oct 06 13:30:26 crc kubenswrapper[4867]: E1006 13:30:26.120409 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ba5c1b-0dcb-4509-bb81-4bda347944bf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.120427 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ba5c1b-0dcb-4509-bb81-4bda347944bf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 06 13:30:26 crc kubenswrapper[4867]: E1006 13:30:26.120462 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39" containerName="collect-profiles"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.120470 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39" containerName="collect-profiles"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.120659 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39" containerName="collect-profiles"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.120672 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ba5c1b-0dcb-4509-bb81-4bda347944bf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.121385 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.123637 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.123851 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.132193 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.132822 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.134502 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"]
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.240878 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.240987 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5v88\" (UniqueName: \"kubernetes.io/projected/b214a0d6-e528-435a-9126-04d18492d264-kube-api-access-w5v88\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.241045 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.343077 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5v88\" (UniqueName: \"kubernetes.io/projected/b214a0d6-e528-435a-9126-04d18492d264-kube-api-access-w5v88\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.343577 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.343754 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.348570 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.354733 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.360498 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5v88\" (UniqueName: \"kubernetes.io/projected/b214a0d6-e528-435a-9126-04d18492d264-kube-api-access-w5v88\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.440089 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.922204 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5"]
Oct 06 13:30:26 crc kubenswrapper[4867]: I1006 13:30:26.928824 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 06 13:30:27 crc kubenswrapper[4867]: I1006 13:30:27.010354 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5" event={"ID":"b214a0d6-e528-435a-9126-04d18492d264","Type":"ContainerStarted","Data":"f5519e89af938819e637520a6f233e04f79434d2449833afb18512851a425186"}
Oct 06 13:30:28 crc kubenswrapper[4867]: I1006 13:30:28.021696 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5" event={"ID":"b214a0d6-e528-435a-9126-04d18492d264","Type":"ContainerStarted","Data":"783099e80b182017cc7f00c47185a3bbbef2afdd46e02aa362974dafa01ad532"}
Oct 06 13:30:28 crc kubenswrapper[4867]: I1006 13:30:28.048086 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5" podStartSLOduration=1.4013727550000001 podStartE2EDuration="2.048065931s" podCreationTimestamp="2025-10-06 13:30:26 +0000 UTC" firstStartedPulling="2025-10-06 13:30:26.928448496 +0000 UTC m=+1606.386396660" lastFinishedPulling="2025-10-06 13:30:27.575141692 +0000 UTC m=+1607.033089836" observedRunningTime="2025-10-06 13:30:28.036898276 +0000 UTC m=+1607.494846430" watchObservedRunningTime="2025-10-06 13:30:28.048065931 +0000 UTC m=+1607.506014075"
Oct 06 13:30:29 crc kubenswrapper[4867]: I1006 13:30:29.034711 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2lcbl"]
Oct 06 13:30:29 crc kubenswrapper[4867]: I1006 13:30:29.061381 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-5dtwn"]
Oct 06 13:30:29 crc kubenswrapper[4867]: I1006 13:30:29.077668 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-9rd88"]
Oct 06 13:30:29 crc kubenswrapper[4867]: I1006 13:30:29.093376 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-5dtwn"]
Oct 06 13:30:29 crc kubenswrapper[4867]: I1006 13:30:29.108407 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2lcbl"]
Oct 06 13:30:29 crc kubenswrapper[4867]: I1006 13:30:29.118093 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-9rd88"]
Oct 06 13:30:29 crc kubenswrapper[4867]: I1006 13:30:29.233028 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad94d2b3-0f12-4bed-82c2-de7289914d0b" path="/var/lib/kubelet/pods/ad94d2b3-0f12-4bed-82c2-de7289914d0b/volumes"
Oct 06 13:30:29 crc kubenswrapper[4867]: I1006 13:30:29.233565 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4791fb2-ae33-4758-824f-4b6b7ae9b4ea" path="/var/lib/kubelet/pods/c4791fb2-ae33-4758-824f-4b6b7ae9b4ea/volumes"
Oct 06 13:30:29 crc kubenswrapper[4867]: I1006 13:30:29.234080 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab09423-89f0-4694-a961-9813755dfd88" path="/var/lib/kubelet/pods/dab09423-89f0-4694-a961-9813755dfd88/volumes"
Oct 06 13:30:31 crc kubenswrapper[4867]: I1006 13:30:31.052678 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-e19f-account-create-f8h75"]
Oct 06 13:30:31 crc kubenswrapper[4867]: I1006 13:30:31.064716 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-e19f-account-create-f8h75"]
Oct 06 13:30:31 crc kubenswrapper[4867]: I1006 13:30:31.236438 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19b8e9b5-da69-4679-a7e9-471cfcfa7d92" path="/var/lib/kubelet/pods/19b8e9b5-da69-4679-a7e9-471cfcfa7d92/volumes"
Oct 06 13:30:39 crc kubenswrapper[4867]: I1006 13:30:39.038853 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-868b-account-create-pg9d2"]
Oct 06 13:30:39 crc kubenswrapper[4867]: I1006 13:30:39.050455 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a51c-account-create-kfp55"]
Oct 06 13:30:39 crc kubenswrapper[4867]: I1006 13:30:39.062786 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-868b-account-create-pg9d2"]
Oct 06 13:30:39 crc kubenswrapper[4867]: I1006 13:30:39.071338 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a51c-account-create-kfp55"]
Oct 06 13:30:39 crc kubenswrapper[4867]: I1006 13:30:39.232966 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e6482c5-e172-4c9b-820f-e3b7f81435fa" path="/var/lib/kubelet/pods/0e6482c5-e172-4c9b-820f-e3b7f81435fa/volumes"
Oct 06 13:30:39 crc kubenswrapper[4867]: I1006 13:30:39.233628 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c62596f3-6c68-47df-9960-c4aa7e5af8fa" path="/var/lib/kubelet/pods/c62596f3-6c68-47df-9960-c4aa7e5af8fa/volumes"
Oct 06 13:30:40 crc kubenswrapper[4867]: I1006 13:30:40.028738 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-380c-account-create-ppfbr"]
Oct 06 13:30:40 crc kubenswrapper[4867]: I1006 13:30:40.040317 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-380c-account-create-ppfbr"]
Oct 06 13:30:41 crc kubenswrapper[4867]: I1006 13:30:41.236634 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8070856-d90e-4aa0-97ca-5d0be29da723" path="/var/lib/kubelet/pods/f8070856-d90e-4aa0-97ca-5d0be29da723/volumes"
Oct 06 13:30:42 crc kubenswrapper[4867]: I1006 13:30:42.873744 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 13:30:42 crc kubenswrapper[4867]: I1006 13:30:42.874326 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 13:31:00 crc kubenswrapper[4867]: I1006 13:31:00.100634 4867 scope.go:117] "RemoveContainer" containerID="dea6c6f18f0394e7d92ff0808615df1a3ea310e6a7b5b7a995fdd0a3d720d39e"
Oct 06 13:31:00 crc kubenswrapper[4867]: I1006 13:31:00.135984 4867 scope.go:117] "RemoveContainer" containerID="fbacba88f8c57c1d7a8b103de6196b99f03d3007da949ac0527df4f2828b959d"
Oct 06 13:31:00 crc kubenswrapper[4867]: I1006 13:31:00.163552 4867 scope.go:117] "RemoveContainer" containerID="f2051c4d693a53f5b826a3414556155e042e4c81995d4ff872345a6846a0fe5c"
Oct 06 13:31:00 crc kubenswrapper[4867]: I1006 13:31:00.212466 4867 scope.go:117] "RemoveContainer" containerID="817fbad0acbe08775f673490b1d6553adc454f9a7a6ce3e2ee351d7826bdc7d7"
Oct 06 13:31:00 crc kubenswrapper[4867]: I1006 13:31:00.259500 4867 scope.go:117] "RemoveContainer" containerID="0ab4e6d31aaf6b5d3dccb9d9267119d8c2e26000f2d36d76f8420ec8d45af974"
Oct 06 13:31:00 crc kubenswrapper[4867]: I1006 13:31:00.308043 4867 scope.go:117] "RemoveContainer" containerID="c6f47b4b07a2f6782951b47238e8446a47596715c44e9cb00eca67fadc44ca2c"
Oct 06 13:31:00 crc kubenswrapper[4867]: I1006 13:31:00.348704 4867 scope.go:117] "RemoveContainer" containerID="41ae2d4db2e143e487e94bc760ad25f9850fe18b989158e1c111b765da334fd5"
Oct 06 13:31:00 crc kubenswrapper[4867]: I1006 13:31:00.392184 4867 scope.go:117] "RemoveContainer"
containerID="9e589270dd9f918321d1dd58661bb514f87584f94d108fa7a50be940780a76fc" Oct 06 13:31:00 crc kubenswrapper[4867]: I1006 13:31:00.426367 4867 scope.go:117] "RemoveContainer" containerID="d838f394b26fe381fefd50902d60e1553f7687e92be0da21b7afd485d5d4def6" Oct 06 13:31:00 crc kubenswrapper[4867]: I1006 13:31:00.448958 4867 scope.go:117] "RemoveContainer" containerID="745ca90ca625a445bbf6e09d15c963dd3ec322db3b4e9b48374176ea91fe4375" Oct 06 13:31:05 crc kubenswrapper[4867]: I1006 13:31:05.053502 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-7nb9q"] Oct 06 13:31:05 crc kubenswrapper[4867]: I1006 13:31:05.063630 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-895bn"] Oct 06 13:31:05 crc kubenswrapper[4867]: I1006 13:31:05.072237 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-895bn"] Oct 06 13:31:05 crc kubenswrapper[4867]: I1006 13:31:05.079917 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-7nb9q"] Oct 06 13:31:05 crc kubenswrapper[4867]: I1006 13:31:05.232423 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ed8ecc-126e-40f7-b923-5e26dacacb06" path="/var/lib/kubelet/pods/32ed8ecc-126e-40f7-b923-5e26dacacb06/volumes" Oct 06 13:31:05 crc kubenswrapper[4867]: I1006 13:31:05.235310 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418" path="/var/lib/kubelet/pods/f8fdd1b3-bafe-4fa9-a7c0-ac2d35bb5418/volumes" Oct 06 13:31:06 crc kubenswrapper[4867]: I1006 13:31:06.023968 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-nzl2m"] Oct 06 13:31:06 crc kubenswrapper[4867]: I1006 13:31:06.032033 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-nzl2m"] Oct 06 13:31:07 crc kubenswrapper[4867]: I1006 13:31:07.232936 4867 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="876182a1-a170-4101-b8ce-041060a74555" path="/var/lib/kubelet/pods/876182a1-a170-4101-b8ce-041060a74555/volumes" Oct 06 13:31:12 crc kubenswrapper[4867]: I1006 13:31:12.873952 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:31:12 crc kubenswrapper[4867]: I1006 13:31:12.874431 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:31:14 crc kubenswrapper[4867]: I1006 13:31:14.035100 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-5kc59"] Oct 06 13:31:14 crc kubenswrapper[4867]: I1006 13:31:14.044360 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-5kc59"] Oct 06 13:31:15 crc kubenswrapper[4867]: I1006 13:31:15.233762 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0efadce0-13dd-4a6d-9a54-9deabd1e8069" path="/var/lib/kubelet/pods/0efadce0-13dd-4a6d-9a54-9deabd1e8069/volumes" Oct 06 13:31:16 crc kubenswrapper[4867]: I1006 13:31:16.030454 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ef38-account-create-9qrb7"] Oct 06 13:31:16 crc kubenswrapper[4867]: I1006 13:31:16.039236 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ef38-account-create-9qrb7"] Oct 06 13:31:17 crc kubenswrapper[4867]: I1006 13:31:17.026005 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-53a2-account-create-wfj5z"] Oct 06 13:31:17 crc 
kubenswrapper[4867]: I1006 13:31:17.034264 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-23c7-account-create-w2ndn"] Oct 06 13:31:17 crc kubenswrapper[4867]: I1006 13:31:17.042757 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-23c7-account-create-w2ndn"] Oct 06 13:31:17 crc kubenswrapper[4867]: I1006 13:31:17.050284 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-53a2-account-create-wfj5z"] Oct 06 13:31:17 crc kubenswrapper[4867]: I1006 13:31:17.232087 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea503d8-6da7-4349-b16c-85e3e66a9f9e" path="/var/lib/kubelet/pods/2ea503d8-6da7-4349-b16c-85e3e66a9f9e/volumes" Oct 06 13:31:17 crc kubenswrapper[4867]: I1006 13:31:17.232762 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cbf821b-c375-4824-95ea-d8774ffb7486" path="/var/lib/kubelet/pods/7cbf821b-c375-4824-95ea-d8774ffb7486/volumes" Oct 06 13:31:17 crc kubenswrapper[4867]: I1006 13:31:17.233328 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e2c571-fd42-4b41-b31e-4774988cfb31" path="/var/lib/kubelet/pods/b3e2c571-fd42-4b41-b31e-4774988cfb31/volumes" Oct 06 13:31:20 crc kubenswrapper[4867]: I1006 13:31:20.034506 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-jfv8d"] Oct 06 13:31:20 crc kubenswrapper[4867]: I1006 13:31:20.042578 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-9rnv4"] Oct 06 13:31:20 crc kubenswrapper[4867]: I1006 13:31:20.052440 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-9rnv4"] Oct 06 13:31:20 crc kubenswrapper[4867]: I1006 13:31:20.060275 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-jfv8d"] Oct 06 13:31:21 crc kubenswrapper[4867]: I1006 13:31:21.236533 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="08af8192-3b42-4ae6-85c1-e12ab46ed88a" path="/var/lib/kubelet/pods/08af8192-3b42-4ae6-85c1-e12ab46ed88a/volumes" Oct 06 13:31:21 crc kubenswrapper[4867]: I1006 13:31:21.237884 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c55d65d-f40c-402b-895b-fdf4000fdf33" path="/var/lib/kubelet/pods/3c55d65d-f40c-402b-895b-fdf4000fdf33/volumes" Oct 06 13:31:42 crc kubenswrapper[4867]: I1006 13:31:42.874356 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:31:42 crc kubenswrapper[4867]: I1006 13:31:42.874970 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:31:42 crc kubenswrapper[4867]: I1006 13:31:42.875021 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:31:42 crc kubenswrapper[4867]: I1006 13:31:42.875856 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:31:42 crc kubenswrapper[4867]: I1006 13:31:42.875913 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" 
podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" gracePeriod=600 Oct 06 13:31:42 crc kubenswrapper[4867]: E1006 13:31:42.999541 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:31:43 crc kubenswrapper[4867]: I1006 13:31:43.842012 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" exitCode=0 Oct 06 13:31:43 crc kubenswrapper[4867]: I1006 13:31:43.842100 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d"} Oct 06 13:31:43 crc kubenswrapper[4867]: I1006 13:31:43.842594 4867 scope.go:117] "RemoveContainer" containerID="708d16f9a6115595b008bafc5ad0e6ec3528bd438b87bd249255c174238bf7ec" Oct 06 13:31:43 crc kubenswrapper[4867]: I1006 13:31:43.844307 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:31:43 crc kubenswrapper[4867]: E1006 13:31:43.845984 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:31:57 crc kubenswrapper[4867]: I1006 13:31:57.052885 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-js7h4"] Oct 06 13:31:57 crc kubenswrapper[4867]: I1006 13:31:57.066875 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-js7h4"] Oct 06 13:31:57 crc kubenswrapper[4867]: I1006 13:31:57.234522 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b74021-1aea-4ef5-981a-2b0fc63ec06b" path="/var/lib/kubelet/pods/82b74021-1aea-4ef5-981a-2b0fc63ec06b/volumes" Oct 06 13:31:58 crc kubenswrapper[4867]: I1006 13:31:58.221394 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:31:58 crc kubenswrapper[4867]: E1006 13:31:58.221956 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:32:00 crc kubenswrapper[4867]: I1006 13:32:00.671314 4867 scope.go:117] "RemoveContainer" containerID="5ac16da2ba117970df0cf4e089c407ddb76be0931c4331f7e9e7a92ed5283470" Oct 06 13:32:00 crc kubenswrapper[4867]: I1006 13:32:00.709394 4867 scope.go:117] "RemoveContainer" containerID="3d59006217d14faf12d3123f98b6d12b667aafcdfc0ed2c3dfbcfb9d24cfe57d" Oct 06 13:32:00 crc kubenswrapper[4867]: I1006 13:32:00.745395 4867 scope.go:117] "RemoveContainer" containerID="df56205b40bac95fae4350af2aeb4267392a2786e22910736a310118cc729475" Oct 
06 13:32:00 crc kubenswrapper[4867]: I1006 13:32:00.793934 4867 scope.go:117] "RemoveContainer" containerID="e5d268946da403efe40890394102b5786ecef49fb9dd02be60e4a623e31b220d" Oct 06 13:32:00 crc kubenswrapper[4867]: I1006 13:32:00.849873 4867 scope.go:117] "RemoveContainer" containerID="2cd606c92663700319d70ddfe520d03e0d4c8353be64b821591a9e77718e5d30" Oct 06 13:32:00 crc kubenswrapper[4867]: I1006 13:32:00.897741 4867 scope.go:117] "RemoveContainer" containerID="7ebc760aac1ef5691ff1339dd8c13a2c73d505d8fd35a5c013a97e14ac7bd4fd" Oct 06 13:32:00 crc kubenswrapper[4867]: I1006 13:32:00.937835 4867 scope.go:117] "RemoveContainer" containerID="2699f945efc447e5da282252d22177c88eea10e9f7b730286a2b81c9e8b50eb9" Oct 06 13:32:00 crc kubenswrapper[4867]: I1006 13:32:00.962509 4867 scope.go:117] "RemoveContainer" containerID="4d2f7ac48e5358987e58bbefe0b0670e046ee3876cc0bea663dab58bf9152451" Oct 06 13:32:00 crc kubenswrapper[4867]: I1006 13:32:00.996029 4867 scope.go:117] "RemoveContainer" containerID="92ba3ad6abe82e07d130480afe543510a84824dbb9d6e46c52ba0429207ec901" Oct 06 13:32:01 crc kubenswrapper[4867]: I1006 13:32:01.039718 4867 scope.go:117] "RemoveContainer" containerID="adf04b731cde07ae684ec0a4ffb7eed577de8c4d9b3e127e57dabb10f30c79d5" Oct 06 13:32:04 crc kubenswrapper[4867]: I1006 13:32:04.093237 4867 generic.go:334] "Generic (PLEG): container finished" podID="b214a0d6-e528-435a-9126-04d18492d264" containerID="783099e80b182017cc7f00c47185a3bbbef2afdd46e02aa362974dafa01ad532" exitCode=0 Oct 06 13:32:04 crc kubenswrapper[4867]: I1006 13:32:04.093315 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5" event={"ID":"b214a0d6-e528-435a-9126-04d18492d264","Type":"ContainerDied","Data":"783099e80b182017cc7f00c47185a3bbbef2afdd46e02aa362974dafa01ad532"} Oct 06 13:32:05 crc kubenswrapper[4867]: I1006 13:32:05.512848 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5" Oct 06 13:32:05 crc kubenswrapper[4867]: I1006 13:32:05.620418 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5v88\" (UniqueName: \"kubernetes.io/projected/b214a0d6-e528-435a-9126-04d18492d264-kube-api-access-w5v88\") pod \"b214a0d6-e528-435a-9126-04d18492d264\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " Oct 06 13:32:05 crc kubenswrapper[4867]: I1006 13:32:05.620721 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-ssh-key\") pod \"b214a0d6-e528-435a-9126-04d18492d264\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " Oct 06 13:32:05 crc kubenswrapper[4867]: I1006 13:32:05.620770 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-inventory\") pod \"b214a0d6-e528-435a-9126-04d18492d264\" (UID: \"b214a0d6-e528-435a-9126-04d18492d264\") " Oct 06 13:32:05 crc kubenswrapper[4867]: I1006 13:32:05.626560 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b214a0d6-e528-435a-9126-04d18492d264-kube-api-access-w5v88" (OuterVolumeSpecName: "kube-api-access-w5v88") pod "b214a0d6-e528-435a-9126-04d18492d264" (UID: "b214a0d6-e528-435a-9126-04d18492d264"). InnerVolumeSpecName "kube-api-access-w5v88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:32:05 crc kubenswrapper[4867]: I1006 13:32:05.652582 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-inventory" (OuterVolumeSpecName: "inventory") pod "b214a0d6-e528-435a-9126-04d18492d264" (UID: "b214a0d6-e528-435a-9126-04d18492d264"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:32:05 crc kubenswrapper[4867]: I1006 13:32:05.671448 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b214a0d6-e528-435a-9126-04d18492d264" (UID: "b214a0d6-e528-435a-9126-04d18492d264"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:32:05 crc kubenswrapper[4867]: I1006 13:32:05.723225 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5v88\" (UniqueName: \"kubernetes.io/projected/b214a0d6-e528-435a-9126-04d18492d264-kube-api-access-w5v88\") on node \"crc\" DevicePath \"\"" Oct 06 13:32:05 crc kubenswrapper[4867]: I1006 13:32:05.723265 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:32:05 crc kubenswrapper[4867]: I1006 13:32:05.723274 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b214a0d6-e528-435a-9126-04d18492d264-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.113074 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5" event={"ID":"b214a0d6-e528-435a-9126-04d18492d264","Type":"ContainerDied","Data":"f5519e89af938819e637520a6f233e04f79434d2449833afb18512851a425186"} Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.113377 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5519e89af938819e637520a6f233e04f79434d2449833afb18512851a425186" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.113118 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.189513 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg"] Oct 06 13:32:06 crc kubenswrapper[4867]: E1006 13:32:06.190273 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b214a0d6-e528-435a-9126-04d18492d264" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.190377 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b214a0d6-e528-435a-9126-04d18492d264" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.190743 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b214a0d6-e528-435a-9126-04d18492d264" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.191798 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.193729 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.193908 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.194527 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.194694 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.204038 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg"] Oct 06 13:32:06 crc kubenswrapper[4867]: E1006 13:32:06.304813 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb214a0d6_e528_435a_9126_04d18492d264.slice\": RecentStats: unable to find data in memory cache]" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.335246 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg\" (UID: \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.336045 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9zkq\" 
(UniqueName: \"kubernetes.io/projected/68b10f2c-285a-4492-90c0-1a3d83ab46e7-kube-api-access-m9zkq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg\" (UID: \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.337138 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg\" (UID: \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.439390 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg\" (UID: \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.439490 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg\" (UID: \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.439599 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9zkq\" (UniqueName: \"kubernetes.io/projected/68b10f2c-285a-4492-90c0-1a3d83ab46e7-kube-api-access-m9zkq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg\" (UID: 
\"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.445806 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg\" (UID: \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.448612 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg\" (UID: \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.457617 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9zkq\" (UniqueName: \"kubernetes.io/projected/68b10f2c-285a-4492-90c0-1a3d83ab46e7-kube-api-access-m9zkq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg\" (UID: \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:32:06 crc kubenswrapper[4867]: I1006 13:32:06.529196 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:32:07 crc kubenswrapper[4867]: I1006 13:32:07.057696 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-d8hp7"] Oct 06 13:32:07 crc kubenswrapper[4867]: I1006 13:32:07.071551 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-d8hp7"] Oct 06 13:32:07 crc kubenswrapper[4867]: I1006 13:32:07.086508 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg"] Oct 06 13:32:07 crc kubenswrapper[4867]: I1006 13:32:07.126551 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" event={"ID":"68b10f2c-285a-4492-90c0-1a3d83ab46e7","Type":"ContainerStarted","Data":"d76d17e90c8b193b0a01d8ebd8ced92c981e81b409cb9249fe93619cc375287a"} Oct 06 13:32:07 crc kubenswrapper[4867]: I1006 13:32:07.232492 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f59ab79-d706-4e1f-9361-6efea6b85568" path="/var/lib/kubelet/pods/4f59ab79-d706-4e1f-9361-6efea6b85568/volumes" Oct 06 13:32:08 crc kubenswrapper[4867]: I1006 13:32:08.137118 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" event={"ID":"68b10f2c-285a-4492-90c0-1a3d83ab46e7","Type":"ContainerStarted","Data":"f7db873c357eef9105393d0973a75742c37008136b6fb852f25caa6d9bd56da0"} Oct 06 13:32:08 crc kubenswrapper[4867]: I1006 13:32:08.153921 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" podStartSLOduration=1.649937774 podStartE2EDuration="2.15390599s" podCreationTimestamp="2025-10-06 13:32:06 +0000 UTC" firstStartedPulling="2025-10-06 13:32:07.097823121 +0000 UTC m=+1706.555771265" lastFinishedPulling="2025-10-06 
13:32:07.601791337 +0000 UTC m=+1707.059739481" observedRunningTime="2025-10-06 13:32:08.151931186 +0000 UTC m=+1707.609879340" watchObservedRunningTime="2025-10-06 13:32:08.15390599 +0000 UTC m=+1707.611854134" Oct 06 13:32:11 crc kubenswrapper[4867]: I1006 13:32:11.235302 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:32:11 crc kubenswrapper[4867]: E1006 13:32:11.236186 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:32:12 crc kubenswrapper[4867]: I1006 13:32:12.032544 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-plcg5"] Oct 06 13:32:12 crc kubenswrapper[4867]: I1006 13:32:12.039458 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-plcg5"] Oct 06 13:32:13 crc kubenswrapper[4867]: I1006 13:32:13.242604 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255" path="/var/lib/kubelet/pods/9a0e3c4c-b8d5-43fc-8d6d-a8f6bc957255/volumes" Oct 06 13:32:23 crc kubenswrapper[4867]: I1006 13:32:23.028531 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-4vzlg"] Oct 06 13:32:23 crc kubenswrapper[4867]: I1006 13:32:23.038070 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-4vzlg"] Oct 06 13:32:23 crc kubenswrapper[4867]: I1006 13:32:23.234553 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab90007-6383-4ff2-97cc-edb5d7d13d1e" 
path="/var/lib/kubelet/pods/1ab90007-6383-4ff2-97cc-edb5d7d13d1e/volumes" Oct 06 13:32:24 crc kubenswrapper[4867]: I1006 13:32:24.221846 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:32:24 crc kubenswrapper[4867]: E1006 13:32:24.222181 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:32:26 crc kubenswrapper[4867]: I1006 13:32:26.026829 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-m8h8z"] Oct 06 13:32:26 crc kubenswrapper[4867]: I1006 13:32:26.034521 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-m8h8z"] Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.241687 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69ef6013-a982-45c6-8fc8-46c11fead4a7" path="/var/lib/kubelet/pods/69ef6013-a982-45c6-8fc8-46c11fead4a7/volumes" Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.429202 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zrgdx"] Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.433405 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.438978 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrgdx"] Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.582545 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gt5c\" (UniqueName: \"kubernetes.io/projected/45a06fd1-a941-427d-b692-e63ea3a3d44a-kube-api-access-2gt5c\") pod \"redhat-marketplace-zrgdx\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.582689 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-catalog-content\") pod \"redhat-marketplace-zrgdx\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.582731 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-utilities\") pod \"redhat-marketplace-zrgdx\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.685174 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-catalog-content\") pod \"redhat-marketplace-zrgdx\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.685268 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-utilities\") pod \"redhat-marketplace-zrgdx\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.685366 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gt5c\" (UniqueName: \"kubernetes.io/projected/45a06fd1-a941-427d-b692-e63ea3a3d44a-kube-api-access-2gt5c\") pod \"redhat-marketplace-zrgdx\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.685888 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-catalog-content\") pod \"redhat-marketplace-zrgdx\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.686076 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-utilities\") pod \"redhat-marketplace-zrgdx\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.706661 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gt5c\" (UniqueName: \"kubernetes.io/projected/45a06fd1-a941-427d-b692-e63ea3a3d44a-kube-api-access-2gt5c\") pod \"redhat-marketplace-zrgdx\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:27 crc kubenswrapper[4867]: I1006 13:32:27.765387 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:28 crc kubenswrapper[4867]: I1006 13:32:28.223449 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrgdx"] Oct 06 13:32:28 crc kubenswrapper[4867]: W1006 13:32:28.231395 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice/crio-567a4af59058b3e8332134a4ff57dbad9307c831b454c022bec5ed048a1a0d63 WatchSource:0}: Error finding container 567a4af59058b3e8332134a4ff57dbad9307c831b454c022bec5ed048a1a0d63: Status 404 returned error can't find the container with id 567a4af59058b3e8332134a4ff57dbad9307c831b454c022bec5ed048a1a0d63 Oct 06 13:32:28 crc kubenswrapper[4867]: I1006 13:32:28.322414 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrgdx" event={"ID":"45a06fd1-a941-427d-b692-e63ea3a3d44a","Type":"ContainerStarted","Data":"567a4af59058b3e8332134a4ff57dbad9307c831b454c022bec5ed048a1a0d63"} Oct 06 13:32:29 crc kubenswrapper[4867]: I1006 13:32:29.337541 4867 generic.go:334] "Generic (PLEG): container finished" podID="45a06fd1-a941-427d-b692-e63ea3a3d44a" containerID="2f435f6d8d43c662a4e737c64b27feb2b6afc0ce0272905eaa67a52acf26c098" exitCode=0 Oct 06 13:32:29 crc kubenswrapper[4867]: I1006 13:32:29.337652 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrgdx" event={"ID":"45a06fd1-a941-427d-b692-e63ea3a3d44a","Type":"ContainerDied","Data":"2f435f6d8d43c662a4e737c64b27feb2b6afc0ce0272905eaa67a52acf26c098"} Oct 06 13:32:30 crc kubenswrapper[4867]: I1006 13:32:30.349917 4867 generic.go:334] "Generic (PLEG): container finished" podID="45a06fd1-a941-427d-b692-e63ea3a3d44a" containerID="86cca3a854bbcf92a31d0be04b27570c18bb2b95e8b31e55612959bcd9ff72b3" exitCode=0 Oct 06 13:32:30 crc kubenswrapper[4867]: I1006 
13:32:30.349963 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrgdx" event={"ID":"45a06fd1-a941-427d-b692-e63ea3a3d44a","Type":"ContainerDied","Data":"86cca3a854bbcf92a31d0be04b27570c18bb2b95e8b31e55612959bcd9ff72b3"} Oct 06 13:32:31 crc kubenswrapper[4867]: I1006 13:32:31.362588 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrgdx" event={"ID":"45a06fd1-a941-427d-b692-e63ea3a3d44a","Type":"ContainerStarted","Data":"8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679"} Oct 06 13:32:31 crc kubenswrapper[4867]: I1006 13:32:31.381836 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zrgdx" podStartSLOduration=2.9121260490000003 podStartE2EDuration="4.381814848s" podCreationTimestamp="2025-10-06 13:32:27 +0000 UTC" firstStartedPulling="2025-10-06 13:32:29.340151593 +0000 UTC m=+1728.798099737" lastFinishedPulling="2025-10-06 13:32:30.809840392 +0000 UTC m=+1730.267788536" observedRunningTime="2025-10-06 13:32:31.380912073 +0000 UTC m=+1730.838860257" watchObservedRunningTime="2025-10-06 13:32:31.381814848 +0000 UTC m=+1730.839763002" Oct 06 13:32:37 crc kubenswrapper[4867]: I1006 13:32:37.222779 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:32:37 crc kubenswrapper[4867]: E1006 13:32:37.223609 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:32:37 crc kubenswrapper[4867]: I1006 13:32:37.766399 4867 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:37 crc kubenswrapper[4867]: I1006 13:32:37.766730 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:37 crc kubenswrapper[4867]: I1006 13:32:37.837864 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:38 crc kubenswrapper[4867]: I1006 13:32:38.475854 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:38 crc kubenswrapper[4867]: I1006 13:32:38.532072 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrgdx"] Oct 06 13:32:40 crc kubenswrapper[4867]: I1006 13:32:40.450531 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zrgdx" podUID="45a06fd1-a941-427d-b692-e63ea3a3d44a" containerName="registry-server" containerID="cri-o://8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679" gracePeriod=2 Oct 06 13:32:40 crc kubenswrapper[4867]: I1006 13:32:40.925783 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.074801 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-catalog-content\") pod \"45a06fd1-a941-427d-b692-e63ea3a3d44a\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.074908 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gt5c\" (UniqueName: \"kubernetes.io/projected/45a06fd1-a941-427d-b692-e63ea3a3d44a-kube-api-access-2gt5c\") pod \"45a06fd1-a941-427d-b692-e63ea3a3d44a\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.075014 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-utilities\") pod \"45a06fd1-a941-427d-b692-e63ea3a3d44a\" (UID: \"45a06fd1-a941-427d-b692-e63ea3a3d44a\") " Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.075755 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-utilities" (OuterVolumeSpecName: "utilities") pod "45a06fd1-a941-427d-b692-e63ea3a3d44a" (UID: "45a06fd1-a941-427d-b692-e63ea3a3d44a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.075888 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.083602 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45a06fd1-a941-427d-b692-e63ea3a3d44a-kube-api-access-2gt5c" (OuterVolumeSpecName: "kube-api-access-2gt5c") pod "45a06fd1-a941-427d-b692-e63ea3a3d44a" (UID: "45a06fd1-a941-427d-b692-e63ea3a3d44a"). InnerVolumeSpecName "kube-api-access-2gt5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.090068 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45a06fd1-a941-427d-b692-e63ea3a3d44a" (UID: "45a06fd1-a941-427d-b692-e63ea3a3d44a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.177492 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a06fd1-a941-427d-b692-e63ea3a3d44a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.177531 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gt5c\" (UniqueName: \"kubernetes.io/projected/45a06fd1-a941-427d-b692-e63ea3a3d44a-kube-api-access-2gt5c\") on node \"crc\" DevicePath \"\"" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.462358 4867 generic.go:334] "Generic (PLEG): container finished" podID="45a06fd1-a941-427d-b692-e63ea3a3d44a" containerID="8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679" exitCode=0 Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.462415 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrgdx" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.462415 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrgdx" event={"ID":"45a06fd1-a941-427d-b692-e63ea3a3d44a","Type":"ContainerDied","Data":"8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679"} Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.462542 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrgdx" event={"ID":"45a06fd1-a941-427d-b692-e63ea3a3d44a","Type":"ContainerDied","Data":"567a4af59058b3e8332134a4ff57dbad9307c831b454c022bec5ed048a1a0d63"} Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.462564 4867 scope.go:117] "RemoveContainer" containerID="8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.489932 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-zrgdx"] Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.492614 4867 scope.go:117] "RemoveContainer" containerID="86cca3a854bbcf92a31d0be04b27570c18bb2b95e8b31e55612959bcd9ff72b3" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.500105 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrgdx"] Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.514723 4867 scope.go:117] "RemoveContainer" containerID="2f435f6d8d43c662a4e737c64b27feb2b6afc0ce0272905eaa67a52acf26c098" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.560596 4867 scope.go:117] "RemoveContainer" containerID="8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679" Oct 06 13:32:41 crc kubenswrapper[4867]: E1006 13:32:41.561102 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679\": container with ID starting with 8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679 not found: ID does not exist" containerID="8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.561154 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679"} err="failed to get container status \"8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679\": rpc error: code = NotFound desc = could not find container \"8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679\": container with ID starting with 8f80698e0b8c7e8b68d9c176596f17411e0c40cab4766b92850fd059120a7679 not found: ID does not exist" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.561186 4867 scope.go:117] "RemoveContainer" 
containerID="86cca3a854bbcf92a31d0be04b27570c18bb2b95e8b31e55612959bcd9ff72b3" Oct 06 13:32:41 crc kubenswrapper[4867]: E1006 13:32:41.561631 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86cca3a854bbcf92a31d0be04b27570c18bb2b95e8b31e55612959bcd9ff72b3\": container with ID starting with 86cca3a854bbcf92a31d0be04b27570c18bb2b95e8b31e55612959bcd9ff72b3 not found: ID does not exist" containerID="86cca3a854bbcf92a31d0be04b27570c18bb2b95e8b31e55612959bcd9ff72b3" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.561663 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cca3a854bbcf92a31d0be04b27570c18bb2b95e8b31e55612959bcd9ff72b3"} err="failed to get container status \"86cca3a854bbcf92a31d0be04b27570c18bb2b95e8b31e55612959bcd9ff72b3\": rpc error: code = NotFound desc = could not find container \"86cca3a854bbcf92a31d0be04b27570c18bb2b95e8b31e55612959bcd9ff72b3\": container with ID starting with 86cca3a854bbcf92a31d0be04b27570c18bb2b95e8b31e55612959bcd9ff72b3 not found: ID does not exist" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.561688 4867 scope.go:117] "RemoveContainer" containerID="2f435f6d8d43c662a4e737c64b27feb2b6afc0ce0272905eaa67a52acf26c098" Oct 06 13:32:41 crc kubenswrapper[4867]: E1006 13:32:41.562215 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f435f6d8d43c662a4e737c64b27feb2b6afc0ce0272905eaa67a52acf26c098\": container with ID starting with 2f435f6d8d43c662a4e737c64b27feb2b6afc0ce0272905eaa67a52acf26c098 not found: ID does not exist" containerID="2f435f6d8d43c662a4e737c64b27feb2b6afc0ce0272905eaa67a52acf26c098" Oct 06 13:32:41 crc kubenswrapper[4867]: I1006 13:32:41.562239 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2f435f6d8d43c662a4e737c64b27feb2b6afc0ce0272905eaa67a52acf26c098"} err="failed to get container status \"2f435f6d8d43c662a4e737c64b27feb2b6afc0ce0272905eaa67a52acf26c098\": rpc error: code = NotFound desc = could not find container \"2f435f6d8d43c662a4e737c64b27feb2b6afc0ce0272905eaa67a52acf26c098\": container with ID starting with 2f435f6d8d43c662a4e737c64b27feb2b6afc0ce0272905eaa67a52acf26c098 not found: ID does not exist" Oct 06 13:32:43 crc kubenswrapper[4867]: I1006 13:32:43.236837 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45a06fd1-a941-427d-b692-e63ea3a3d44a" path="/var/lib/kubelet/pods/45a06fd1-a941-427d-b692-e63ea3a3d44a/volumes" Oct 06 13:32:47 crc kubenswrapper[4867]: E1006 13:32:47.434209 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice/crio-567a4af59058b3e8332134a4ff57dbad9307c831b454c022bec5ed048a1a0d63\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice\": RecentStats: unable to find data in memory cache]" Oct 06 13:32:50 crc kubenswrapper[4867]: I1006 13:32:50.222470 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:32:50 crc kubenswrapper[4867]: E1006 13:32:50.223243 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:32:57 crc kubenswrapper[4867]: 
E1006 13:32:57.696904 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice/crio-567a4af59058b3e8332134a4ff57dbad9307c831b454c022bec5ed048a1a0d63\": RecentStats: unable to find data in memory cache]" Oct 06 13:33:01 crc kubenswrapper[4867]: I1006 13:33:01.246142 4867 scope.go:117] "RemoveContainer" containerID="b156a56991af149be07f68a2d46a2d3973ab6eb4e2e2b1fb6f24753c0d2e173a" Oct 06 13:33:01 crc kubenswrapper[4867]: I1006 13:33:01.291419 4867 scope.go:117] "RemoveContainer" containerID="cad0d3e4e6941aba5ca0359acf556a3542d67087e60b8a29f95c9045523ff4e6" Oct 06 13:33:01 crc kubenswrapper[4867]: I1006 13:33:01.340165 4867 scope.go:117] "RemoveContainer" containerID="bf267a34fc01a7e04ddf7818b14c07aaa19939c20716b1b7141e26d95cb6957f" Oct 06 13:33:01 crc kubenswrapper[4867]: I1006 13:33:01.393574 4867 scope.go:117] "RemoveContainer" containerID="aa1955bd06810fbd441cf73abe03679e07d67298e6bb63ad9ce5865b8cd5d85d" Oct 06 13:33:05 crc kubenswrapper[4867]: I1006 13:33:05.221946 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:33:05 crc kubenswrapper[4867]: E1006 13:33:05.222598 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:33:08 crc kubenswrapper[4867]: E1006 13:33:08.000221 4867 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice/crio-567a4af59058b3e8332134a4ff57dbad9307c831b454c022bec5ed048a1a0d63\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice\": RecentStats: unable to find data in memory cache]" Oct 06 13:33:17 crc kubenswrapper[4867]: I1006 13:33:17.221703 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:33:17 crc kubenswrapper[4867]: E1006 13:33:17.222946 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:33:18 crc kubenswrapper[4867]: E1006 13:33:18.276382 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice/crio-567a4af59058b3e8332134a4ff57dbad9307c831b454c022bec5ed048a1a0d63\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice\": RecentStats: unable to find data in memory cache]" Oct 06 13:33:20 crc kubenswrapper[4867]: I1006 13:33:20.869031 4867 generic.go:334] "Generic (PLEG): container finished" podID="68b10f2c-285a-4492-90c0-1a3d83ab46e7" containerID="f7db873c357eef9105393d0973a75742c37008136b6fb852f25caa6d9bd56da0" exitCode=0 
Oct 06 13:33:20 crc kubenswrapper[4867]: I1006 13:33:20.869090 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" event={"ID":"68b10f2c-285a-4492-90c0-1a3d83ab46e7","Type":"ContainerDied","Data":"f7db873c357eef9105393d0973a75742c37008136b6fb852f25caa6d9bd56da0"} Oct 06 13:33:21 crc kubenswrapper[4867]: I1006 13:33:21.051539 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mvq2j"] Oct 06 13:33:21 crc kubenswrapper[4867]: I1006 13:33:21.061585 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-xpp6d"] Oct 06 13:33:21 crc kubenswrapper[4867]: I1006 13:33:21.072476 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jxwfj"] Oct 06 13:33:21 crc kubenswrapper[4867]: I1006 13:33:21.082731 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jxwfj"] Oct 06 13:33:21 crc kubenswrapper[4867]: I1006 13:33:21.092509 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mvq2j"] Oct 06 13:33:21 crc kubenswrapper[4867]: I1006 13:33:21.102201 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-xpp6d"] Oct 06 13:33:21 crc kubenswrapper[4867]: I1006 13:33:21.231015 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f1a49ca-4178-4e3b-958e-a766b5087e59" path="/var/lib/kubelet/pods/3f1a49ca-4178-4e3b-958e-a766b5087e59/volumes" Oct 06 13:33:21 crc kubenswrapper[4867]: I1006 13:33:21.231709 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8" path="/var/lib/kubelet/pods/7efcdadc-ad8e-4c32-8c7d-7c3e387d2cc8/volumes" Oct 06 13:33:21 crc kubenswrapper[4867]: I1006 13:33:21.232338 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb523789-87d3-4cd0-8047-a431894a91ff" 
path="/var/lib/kubelet/pods/cb523789-87d3-4cd0-8047-a431894a91ff/volumes" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.328872 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.496210 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-ssh-key\") pod \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\" (UID: \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.496565 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-inventory\") pod \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\" (UID: \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.496773 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9zkq\" (UniqueName: \"kubernetes.io/projected/68b10f2c-285a-4492-90c0-1a3d83ab46e7-kube-api-access-m9zkq\") pod \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\" (UID: \"68b10f2c-285a-4492-90c0-1a3d83ab46e7\") " Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.504363 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b10f2c-285a-4492-90c0-1a3d83ab46e7-kube-api-access-m9zkq" (OuterVolumeSpecName: "kube-api-access-m9zkq") pod "68b10f2c-285a-4492-90c0-1a3d83ab46e7" (UID: "68b10f2c-285a-4492-90c0-1a3d83ab46e7"). InnerVolumeSpecName "kube-api-access-m9zkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.530964 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-inventory" (OuterVolumeSpecName: "inventory") pod "68b10f2c-285a-4492-90c0-1a3d83ab46e7" (UID: "68b10f2c-285a-4492-90c0-1a3d83ab46e7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.532144 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "68b10f2c-285a-4492-90c0-1a3d83ab46e7" (UID: "68b10f2c-285a-4492-90c0-1a3d83ab46e7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.599161 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9zkq\" (UniqueName: \"kubernetes.io/projected/68b10f2c-285a-4492-90c0-1a3d83ab46e7-kube-api-access-m9zkq\") on node \"crc\" DevicePath \"\"" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.599445 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.599508 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68b10f2c-285a-4492-90c0-1a3d83ab46e7-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.890154 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" 
event={"ID":"68b10f2c-285a-4492-90c0-1a3d83ab46e7","Type":"ContainerDied","Data":"d76d17e90c8b193b0a01d8ebd8ced92c981e81b409cb9249fe93619cc375287a"} Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.890198 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d76d17e90c8b193b0a01d8ebd8ced92c981e81b409cb9249fe93619cc375287a" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.890669 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.997174 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k"] Oct 06 13:33:22 crc kubenswrapper[4867]: E1006 13:33:22.997601 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a06fd1-a941-427d-b692-e63ea3a3d44a" containerName="registry-server" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.997619 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a06fd1-a941-427d-b692-e63ea3a3d44a" containerName="registry-server" Oct 06 13:33:22 crc kubenswrapper[4867]: E1006 13:33:22.997649 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b10f2c-285a-4492-90c0-1a3d83ab46e7" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.997656 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b10f2c-285a-4492-90c0-1a3d83ab46e7" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 13:33:22 crc kubenswrapper[4867]: E1006 13:33:22.997674 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a06fd1-a941-427d-b692-e63ea3a3d44a" containerName="extract-utilities" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.997679 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="45a06fd1-a941-427d-b692-e63ea3a3d44a" containerName="extract-utilities" Oct 06 13:33:22 crc kubenswrapper[4867]: E1006 13:33:22.997696 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a06fd1-a941-427d-b692-e63ea3a3d44a" containerName="extract-content" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.997702 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a06fd1-a941-427d-b692-e63ea3a3d44a" containerName="extract-content" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.997882 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b10f2c-285a-4492-90c0-1a3d83ab46e7" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.997906 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="45a06fd1-a941-427d-b692-e63ea3a3d44a" containerName="registry-server" Oct 06 13:33:22 crc kubenswrapper[4867]: I1006 13:33:22.998590 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.002198 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.003234 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.003326 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.003393 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.012124 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k"] Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.120881 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.120954 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kztm\" (UniqueName: \"kubernetes.io/projected/4c45ecf4-2135-407f-ab03-6c1571cd3f76-kube-api-access-4kztm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 
13:33:23.121019 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.223173 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.223223 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kztm\" (UniqueName: \"kubernetes.io/projected/4c45ecf4-2135-407f-ab03-6c1571cd3f76-kube-api-access-4kztm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.223308 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.226625 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.226811 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.244655 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kztm\" (UniqueName: \"kubernetes.io/projected/4c45ecf4-2135-407f-ab03-6c1571cd3f76-kube-api-access-4kztm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.327052 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:23 crc kubenswrapper[4867]: I1006 13:33:23.932731 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k"] Oct 06 13:33:24 crc kubenswrapper[4867]: I1006 13:33:24.913790 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" event={"ID":"4c45ecf4-2135-407f-ab03-6c1571cd3f76","Type":"ContainerStarted","Data":"464d87b8c990415a1642baa673b53891f5c07fdcef59bf066a76594bfaec8640"} Oct 06 13:33:24 crc kubenswrapper[4867]: I1006 13:33:24.914193 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" event={"ID":"4c45ecf4-2135-407f-ab03-6c1571cd3f76","Type":"ContainerStarted","Data":"554d202e54dfe79f3e2831641ae23f4ed157f9cc8a85e775c9127bfa14b07fbb"} Oct 06 13:33:24 crc kubenswrapper[4867]: I1006 13:33:24.945223 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" podStartSLOduration=2.334830467 podStartE2EDuration="2.945186641s" podCreationTimestamp="2025-10-06 13:33:22 +0000 UTC" firstStartedPulling="2025-10-06 13:33:23.927450538 +0000 UTC m=+1783.385398682" lastFinishedPulling="2025-10-06 13:33:24.537806712 +0000 UTC m=+1783.995754856" observedRunningTime="2025-10-06 13:33:24.934204071 +0000 UTC m=+1784.392152225" watchObservedRunningTime="2025-10-06 13:33:24.945186641 +0000 UTC m=+1784.403134785" Oct 06 13:33:28 crc kubenswrapper[4867]: I1006 13:33:28.229392 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:33:28 crc kubenswrapper[4867]: E1006 13:33:28.230471 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:33:28 crc kubenswrapper[4867]: E1006 13:33:28.538534 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice/crio-567a4af59058b3e8332134a4ff57dbad9307c831b454c022bec5ed048a1a0d63\": RecentStats: unable to find data in memory cache]" Oct 06 13:33:30 crc kubenswrapper[4867]: I1006 13:33:30.992942 4867 generic.go:334] "Generic (PLEG): container finished" podID="4c45ecf4-2135-407f-ab03-6c1571cd3f76" containerID="464d87b8c990415a1642baa673b53891f5c07fdcef59bf066a76594bfaec8640" exitCode=0 Oct 06 13:33:30 crc kubenswrapper[4867]: I1006 13:33:30.993572 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" event={"ID":"4c45ecf4-2135-407f-ab03-6c1571cd3f76","Type":"ContainerDied","Data":"464d87b8c990415a1642baa673b53891f5c07fdcef59bf066a76594bfaec8640"} Oct 06 13:33:32 crc kubenswrapper[4867]: I1006 13:33:32.516818 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:32 crc kubenswrapper[4867]: I1006 13:33:32.640945 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kztm\" (UniqueName: \"kubernetes.io/projected/4c45ecf4-2135-407f-ab03-6c1571cd3f76-kube-api-access-4kztm\") pod \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " Oct 06 13:33:32 crc kubenswrapper[4867]: I1006 13:33:32.641029 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-ssh-key\") pod \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " Oct 06 13:33:32 crc kubenswrapper[4867]: I1006 13:33:32.641090 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-inventory\") pod \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\" (UID: \"4c45ecf4-2135-407f-ab03-6c1571cd3f76\") " Oct 06 13:33:32 crc kubenswrapper[4867]: I1006 13:33:32.646997 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c45ecf4-2135-407f-ab03-6c1571cd3f76-kube-api-access-4kztm" (OuterVolumeSpecName: "kube-api-access-4kztm") pod "4c45ecf4-2135-407f-ab03-6c1571cd3f76" (UID: "4c45ecf4-2135-407f-ab03-6c1571cd3f76"). InnerVolumeSpecName "kube-api-access-4kztm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:33:32 crc kubenswrapper[4867]: I1006 13:33:32.678392 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4c45ecf4-2135-407f-ab03-6c1571cd3f76" (UID: "4c45ecf4-2135-407f-ab03-6c1571cd3f76"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:33:32 crc kubenswrapper[4867]: I1006 13:33:32.678576 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-inventory" (OuterVolumeSpecName: "inventory") pod "4c45ecf4-2135-407f-ab03-6c1571cd3f76" (UID: "4c45ecf4-2135-407f-ab03-6c1571cd3f76"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:33:32 crc kubenswrapper[4867]: I1006 13:33:32.744245 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kztm\" (UniqueName: \"kubernetes.io/projected/4c45ecf4-2135-407f-ab03-6c1571cd3f76-kube-api-access-4kztm\") on node \"crc\" DevicePath \"\"" Oct 06 13:33:32 crc kubenswrapper[4867]: I1006 13:33:32.744319 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:33:32 crc kubenswrapper[4867]: I1006 13:33:32.744333 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c45ecf4-2135-407f-ab03-6c1571cd3f76-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.018401 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" event={"ID":"4c45ecf4-2135-407f-ab03-6c1571cd3f76","Type":"ContainerDied","Data":"554d202e54dfe79f3e2831641ae23f4ed157f9cc8a85e775c9127bfa14b07fbb"} Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.018452 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="554d202e54dfe79f3e2831641ae23f4ed157f9cc8a85e775c9127bfa14b07fbb" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.018481 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.132105 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs"] Oct 06 13:33:33 crc kubenswrapper[4867]: E1006 13:33:33.133557 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c45ecf4-2135-407f-ab03-6c1571cd3f76" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.133583 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c45ecf4-2135-407f-ab03-6c1571cd3f76" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.133827 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c45ecf4-2135-407f-ab03-6c1571cd3f76" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.134613 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.140775 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.140981 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.141120 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.141239 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.157157 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs"] Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.254770 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qbxjs\" (UID: \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.254853 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qbxjs\" (UID: \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.254990 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxfkn\" (UniqueName: \"kubernetes.io/projected/828fbe77-2fb8-4ed5-b64f-733c1dad834d-kube-api-access-qxfkn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qbxjs\" (UID: \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.358604 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxfkn\" (UniqueName: \"kubernetes.io/projected/828fbe77-2fb8-4ed5-b64f-733c1dad834d-kube-api-access-qxfkn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qbxjs\" (UID: \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.358909 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qbxjs\" (UID: \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.358993 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qbxjs\" (UID: \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.364043 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qbxjs\" (UID: 
\"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.364388 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qbxjs\" (UID: \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.379661 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxfkn\" (UniqueName: \"kubernetes.io/projected/828fbe77-2fb8-4ed5-b64f-733c1dad834d-kube-api-access-qxfkn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qbxjs\" (UID: \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:33:33 crc kubenswrapper[4867]: I1006 13:33:33.456547 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:33:34 crc kubenswrapper[4867]: W1006 13:33:34.120264 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod828fbe77_2fb8_4ed5_b64f_733c1dad834d.slice/crio-be973a8de0dda049480f0805a208200dd094d5d58650831619143a1ad273509e WatchSource:0}: Error finding container be973a8de0dda049480f0805a208200dd094d5d58650831619143a1ad273509e: Status 404 returned error can't find the container with id be973a8de0dda049480f0805a208200dd094d5d58650831619143a1ad273509e Oct 06 13:33:34 crc kubenswrapper[4867]: I1006 13:33:34.124357 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs"] Oct 06 13:33:35 crc kubenswrapper[4867]: I1006 13:33:35.040143 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" event={"ID":"828fbe77-2fb8-4ed5-b64f-733c1dad834d","Type":"ContainerStarted","Data":"488b1b8bcc1d66f22f5150763cb1eab186f1a4c2d7786400c49ed32f8a15c9eb"} Oct 06 13:33:35 crc kubenswrapper[4867]: I1006 13:33:35.040589 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" event={"ID":"828fbe77-2fb8-4ed5-b64f-733c1dad834d","Type":"ContainerStarted","Data":"be973a8de0dda049480f0805a208200dd094d5d58650831619143a1ad273509e"} Oct 06 13:33:35 crc kubenswrapper[4867]: I1006 13:33:35.058076 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" podStartSLOduration=1.507852052 podStartE2EDuration="2.058050423s" podCreationTimestamp="2025-10-06 13:33:33 +0000 UTC" firstStartedPulling="2025-10-06 13:33:34.124811438 +0000 UTC m=+1793.582759582" lastFinishedPulling="2025-10-06 13:33:34.675009809 +0000 UTC m=+1794.132957953" 
observedRunningTime="2025-10-06 13:33:35.053742505 +0000 UTC m=+1794.511690659" watchObservedRunningTime="2025-10-06 13:33:35.058050423 +0000 UTC m=+1794.515998567" Oct 06 13:33:38 crc kubenswrapper[4867]: E1006 13:33:38.795801 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice/crio-567a4af59058b3e8332134a4ff57dbad9307c831b454c022bec5ed048a1a0d63\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a06fd1_a941_427d_b692_e63ea3a3d44a.slice\": RecentStats: unable to find data in memory cache]" Oct 06 13:33:39 crc kubenswrapper[4867]: I1006 13:33:39.044778 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6c46-account-create-f62ww"] Oct 06 13:33:39 crc kubenswrapper[4867]: I1006 13:33:39.052235 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6c46-account-create-f62ww"] Oct 06 13:33:39 crc kubenswrapper[4867]: I1006 13:33:39.240896 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114" path="/var/lib/kubelet/pods/1b7b9ef9-9eb7-4b0a-8e49-e7ba2c6d1114/volumes" Oct 06 13:33:40 crc kubenswrapper[4867]: I1006 13:33:40.040103 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-33f7-account-create-t9qfg"] Oct 06 13:33:40 crc kubenswrapper[4867]: I1006 13:33:40.050064 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-33f7-account-create-t9qfg"] Oct 06 13:33:40 crc kubenswrapper[4867]: I1006 13:33:40.221084 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:33:40 crc kubenswrapper[4867]: E1006 13:33:40.221548 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:33:41 crc kubenswrapper[4867]: I1006 13:33:41.048410 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ea86-account-create-jdmnp"] Oct 06 13:33:41 crc kubenswrapper[4867]: I1006 13:33:41.089252 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ea86-account-create-jdmnp"] Oct 06 13:33:41 crc kubenswrapper[4867]: I1006 13:33:41.232920 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40974151-3ffd-421d-8a05-acb6b0ea7f0e" path="/var/lib/kubelet/pods/40974151-3ffd-421d-8a05-acb6b0ea7f0e/volumes" Oct 06 13:33:41 crc kubenswrapper[4867]: I1006 13:33:41.233492 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9250a9b-2ae9-4759-ad53-a8ff5016200f" path="/var/lib/kubelet/pods/c9250a9b-2ae9-4759-ad53-a8ff5016200f/volumes" Oct 06 13:33:55 crc kubenswrapper[4867]: I1006 13:33:55.221643 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:33:55 crc kubenswrapper[4867]: E1006 13:33:55.222513 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:34:01 crc kubenswrapper[4867]: I1006 13:34:01.561233 4867 scope.go:117] "RemoveContainer" 
containerID="7558fb15c1acf747f5b9ae73be1d9d7d0898ae57dcbdf4cb85aa0ce1a31db1b6" Oct 06 13:34:01 crc kubenswrapper[4867]: I1006 13:34:01.585328 4867 scope.go:117] "RemoveContainer" containerID="84d23888b58dea6466f6c6a45b07358d5daf7bd5d55f1211cd35aec12a43554c" Oct 06 13:34:01 crc kubenswrapper[4867]: I1006 13:34:01.632917 4867 scope.go:117] "RemoveContainer" containerID="f04328fce3f0e8db1446137312663ce22cfd5858142bb839f1b6de6c54f1c9cf" Oct 06 13:34:01 crc kubenswrapper[4867]: I1006 13:34:01.689813 4867 scope.go:117] "RemoveContainer" containerID="b477480223cd8714c5f5d8ea9f8597383152d9544938c31b2d4e72bb88acb43b" Oct 06 13:34:01 crc kubenswrapper[4867]: I1006 13:34:01.726693 4867 scope.go:117] "RemoveContainer" containerID="d188d39a7decbb3c975c7d397a2dcf4525b0f90cf4b59da34e5014a956f8f055" Oct 06 13:34:01 crc kubenswrapper[4867]: I1006 13:34:01.796272 4867 scope.go:117] "RemoveContainer" containerID="c3d92978090554240239a433470df8ae70cdbb5a44d699d61300cf84f5db208b" Oct 06 13:34:06 crc kubenswrapper[4867]: I1006 13:34:06.053157 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6jf29"] Oct 06 13:34:06 crc kubenswrapper[4867]: I1006 13:34:06.060931 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6jf29"] Oct 06 13:34:07 crc kubenswrapper[4867]: I1006 13:34:07.233101 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c135ec-9d78-4745-b065-e035c34fc51c" path="/var/lib/kubelet/pods/76c135ec-9d78-4745-b065-e035c34fc51c/volumes" Oct 06 13:34:09 crc kubenswrapper[4867]: I1006 13:34:09.222098 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:34:09 crc kubenswrapper[4867]: E1006 13:34:09.223151 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:34:20 crc kubenswrapper[4867]: I1006 13:34:20.530423 4867 generic.go:334] "Generic (PLEG): container finished" podID="828fbe77-2fb8-4ed5-b64f-733c1dad834d" containerID="488b1b8bcc1d66f22f5150763cb1eab186f1a4c2d7786400c49ed32f8a15c9eb" exitCode=0 Oct 06 13:34:20 crc kubenswrapper[4867]: I1006 13:34:20.530540 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" event={"ID":"828fbe77-2fb8-4ed5-b64f-733c1dad834d","Type":"ContainerDied","Data":"488b1b8bcc1d66f22f5150763cb1eab186f1a4c2d7786400c49ed32f8a15c9eb"} Oct 06 13:34:21 crc kubenswrapper[4867]: I1006 13:34:21.233400 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:34:21 crc kubenswrapper[4867]: E1006 13:34:21.235015 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.050498 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.132053 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-inventory\") pod \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\" (UID: \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.132916 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-ssh-key\") pod \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\" (UID: \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.133131 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxfkn\" (UniqueName: \"kubernetes.io/projected/828fbe77-2fb8-4ed5-b64f-733c1dad834d-kube-api-access-qxfkn\") pod \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\" (UID: \"828fbe77-2fb8-4ed5-b64f-733c1dad834d\") " Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.140988 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/828fbe77-2fb8-4ed5-b64f-733c1dad834d-kube-api-access-qxfkn" (OuterVolumeSpecName: "kube-api-access-qxfkn") pod "828fbe77-2fb8-4ed5-b64f-733c1dad834d" (UID: "828fbe77-2fb8-4ed5-b64f-733c1dad834d"). InnerVolumeSpecName "kube-api-access-qxfkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.169527 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-inventory" (OuterVolumeSpecName: "inventory") pod "828fbe77-2fb8-4ed5-b64f-733c1dad834d" (UID: "828fbe77-2fb8-4ed5-b64f-733c1dad834d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.184143 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "828fbe77-2fb8-4ed5-b64f-733c1dad834d" (UID: "828fbe77-2fb8-4ed5-b64f-733c1dad834d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.236577 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.236622 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/828fbe77-2fb8-4ed5-b64f-733c1dad834d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.236633 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxfkn\" (UniqueName: \"kubernetes.io/projected/828fbe77-2fb8-4ed5-b64f-733c1dad834d-kube-api-access-qxfkn\") on node \"crc\" DevicePath \"\"" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.625727 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" event={"ID":"828fbe77-2fb8-4ed5-b64f-733c1dad834d","Type":"ContainerDied","Data":"be973a8de0dda049480f0805a208200dd094d5d58650831619143a1ad273509e"} Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.626358 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be973a8de0dda049480f0805a208200dd094d5d58650831619143a1ad273509e" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.627073 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qbxjs" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.685580 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9"] Oct 06 13:34:22 crc kubenswrapper[4867]: E1006 13:34:22.693215 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828fbe77-2fb8-4ed5-b64f-733c1dad834d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.693273 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="828fbe77-2fb8-4ed5-b64f-733c1dad834d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.693463 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="828fbe77-2fb8-4ed5-b64f-733c1dad834d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.694274 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.697560 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.697581 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.697924 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.698048 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.700729 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9"] Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.749100 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9\" (UID: \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.749489 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9\" (UID: \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.749716 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtvdr\" (UniqueName: \"kubernetes.io/projected/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-kube-api-access-rtvdr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9\" (UID: \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.851531 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9\" (UID: \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.851919 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9\" (UID: \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.852051 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtvdr\" (UniqueName: \"kubernetes.io/projected/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-kube-api-access-rtvdr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9\" (UID: \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.856521 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9\" (UID: 
\"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.867506 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9\" (UID: \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:34:22 crc kubenswrapper[4867]: I1006 13:34:22.871172 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtvdr\" (UniqueName: \"kubernetes.io/projected/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-kube-api-access-rtvdr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9\" (UID: \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:34:23 crc kubenswrapper[4867]: I1006 13:34:23.016155 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:34:23 crc kubenswrapper[4867]: I1006 13:34:23.595951 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9"] Oct 06 13:34:23 crc kubenswrapper[4867]: I1006 13:34:23.639111 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" event={"ID":"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee","Type":"ContainerStarted","Data":"9eba7f0ad94eed7d114bdf5f36f52cd00c1807167005474e546e75f5fad2e040"} Oct 06 13:34:24 crc kubenswrapper[4867]: I1006 13:34:24.650412 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" event={"ID":"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee","Type":"ContainerStarted","Data":"8c10f6e2c49c84bae5eb9ed0f00a04806a755a7663c992c27abd3c60ca0a3f3a"} Oct 06 13:34:24 crc kubenswrapper[4867]: I1006 13:34:24.678605 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" podStartSLOduration=2.2649396729999998 podStartE2EDuration="2.678578072s" podCreationTimestamp="2025-10-06 13:34:22 +0000 UTC" firstStartedPulling="2025-10-06 13:34:23.59781565 +0000 UTC m=+1843.055763804" lastFinishedPulling="2025-10-06 13:34:24.011454059 +0000 UTC m=+1843.469402203" observedRunningTime="2025-10-06 13:34:24.670201963 +0000 UTC m=+1844.128150117" watchObservedRunningTime="2025-10-06 13:34:24.678578072 +0000 UTC m=+1844.136526216" Oct 06 13:34:31 crc kubenswrapper[4867]: I1006 13:34:31.048050 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-6t4g8"] Oct 06 13:34:31 crc kubenswrapper[4867]: I1006 13:34:31.062309 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-6t4g8"] Oct 06 13:34:31 crc kubenswrapper[4867]: 
I1006 13:34:31.232656 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e0f139d-a521-4cc0-95cd-76e5f7f4d8de" path="/var/lib/kubelet/pods/6e0f139d-a521-4cc0-95cd-76e5f7f4d8de/volumes" Oct 06 13:34:32 crc kubenswrapper[4867]: I1006 13:34:32.222125 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:34:32 crc kubenswrapper[4867]: E1006 13:34:32.222772 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:34:33 crc kubenswrapper[4867]: I1006 13:34:33.025209 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-45s42"] Oct 06 13:34:33 crc kubenswrapper[4867]: I1006 13:34:33.033369 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-45s42"] Oct 06 13:34:33 crc kubenswrapper[4867]: I1006 13:34:33.239792 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f6b405a-2a1a-4e35-b7e4-b5067e48fe18" path="/var/lib/kubelet/pods/8f6b405a-2a1a-4e35-b7e4-b5067e48fe18/volumes" Oct 06 13:34:47 crc kubenswrapper[4867]: I1006 13:34:47.220998 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:34:47 crc kubenswrapper[4867]: E1006 13:34:47.223411 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:34:59 crc kubenswrapper[4867]: I1006 13:34:59.221766 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:34:59 crc kubenswrapper[4867]: E1006 13:34:59.222716 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:35:01 crc kubenswrapper[4867]: I1006 13:35:01.918469 4867 scope.go:117] "RemoveContainer" containerID="f6b312f605aa716b29d47faf3a897e9c60c9bcb109435e203fe7a221ed7b755d" Oct 06 13:35:01 crc kubenswrapper[4867]: I1006 13:35:01.975175 4867 scope.go:117] "RemoveContainer" containerID="36e9525b08a2a448d75104b777e67afd6e6fbbd4a8f03f3f9412b4db00b869d0" Oct 06 13:35:02 crc kubenswrapper[4867]: I1006 13:35:02.124070 4867 scope.go:117] "RemoveContainer" containerID="9af65740392f896f693d7a00a116eee6637ce08de122e10313b26cc6d49e3251" Oct 06 13:35:12 crc kubenswrapper[4867]: I1006 13:35:12.222431 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:35:12 crc kubenswrapper[4867]: E1006 13:35:12.223771 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:35:15 crc kubenswrapper[4867]: I1006 13:35:15.048711 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-jvnnk"] Oct 06 13:35:15 crc kubenswrapper[4867]: I1006 13:35:15.058684 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-jvnnk"] Oct 06 13:35:15 crc kubenswrapper[4867]: I1006 13:35:15.235729 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860" path="/var/lib/kubelet/pods/5b6f76f8-2a43-4ee8-ad14-5cba3dd4c860/volumes" Oct 06 13:35:26 crc kubenswrapper[4867]: I1006 13:35:26.319203 4867 generic.go:334] "Generic (PLEG): container finished" podID="8b54a7c9-430b-4dfc-9ffb-ae3c790372ee" containerID="8c10f6e2c49c84bae5eb9ed0f00a04806a755a7663c992c27abd3c60ca0a3f3a" exitCode=2 Oct 06 13:35:26 crc kubenswrapper[4867]: I1006 13:35:26.319414 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" event={"ID":"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee","Type":"ContainerDied","Data":"8c10f6e2c49c84bae5eb9ed0f00a04806a755a7663c992c27abd3c60ca0a3f3a"} Oct 06 13:35:27 crc kubenswrapper[4867]: I1006 13:35:27.222244 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:35:27 crc kubenswrapper[4867]: E1006 13:35:27.223324 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:35:27 crc 
kubenswrapper[4867]: I1006 13:35:27.763115 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:35:27 crc kubenswrapper[4867]: I1006 13:35:27.903521 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-ssh-key\") pod \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\" (UID: \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " Oct 06 13:35:27 crc kubenswrapper[4867]: I1006 13:35:27.903662 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtvdr\" (UniqueName: \"kubernetes.io/projected/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-kube-api-access-rtvdr\") pod \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\" (UID: \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " Oct 06 13:35:27 crc kubenswrapper[4867]: I1006 13:35:27.903892 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-inventory\") pod \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\" (UID: \"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee\") " Oct 06 13:35:27 crc kubenswrapper[4867]: I1006 13:35:27.911762 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-kube-api-access-rtvdr" (OuterVolumeSpecName: "kube-api-access-rtvdr") pod "8b54a7c9-430b-4dfc-9ffb-ae3c790372ee" (UID: "8b54a7c9-430b-4dfc-9ffb-ae3c790372ee"). InnerVolumeSpecName "kube-api-access-rtvdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:35:27 crc kubenswrapper[4867]: I1006 13:35:27.938227 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8b54a7c9-430b-4dfc-9ffb-ae3c790372ee" (UID: "8b54a7c9-430b-4dfc-9ffb-ae3c790372ee"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:35:27 crc kubenswrapper[4867]: I1006 13:35:27.953103 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-inventory" (OuterVolumeSpecName: "inventory") pod "8b54a7c9-430b-4dfc-9ffb-ae3c790372ee" (UID: "8b54a7c9-430b-4dfc-9ffb-ae3c790372ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:35:28 crc kubenswrapper[4867]: I1006 13:35:28.007142 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:35:28 crc kubenswrapper[4867]: I1006 13:35:28.007184 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:35:28 crc kubenswrapper[4867]: I1006 13:35:28.007198 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtvdr\" (UniqueName: \"kubernetes.io/projected/8b54a7c9-430b-4dfc-9ffb-ae3c790372ee-kube-api-access-rtvdr\") on node \"crc\" DevicePath \"\"" Oct 06 13:35:28 crc kubenswrapper[4867]: I1006 13:35:28.364535 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" 
event={"ID":"8b54a7c9-430b-4dfc-9ffb-ae3c790372ee","Type":"ContainerDied","Data":"9eba7f0ad94eed7d114bdf5f36f52cd00c1807167005474e546e75f5fad2e040"} Oct 06 13:35:28 crc kubenswrapper[4867]: I1006 13:35:28.364600 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9" Oct 06 13:35:28 crc kubenswrapper[4867]: I1006 13:35:28.364611 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eba7f0ad94eed7d114bdf5f36f52cd00c1807167005474e546e75f5fad2e040" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.040042 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl"] Oct 06 13:35:35 crc kubenswrapper[4867]: E1006 13:35:35.041512 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b54a7c9-430b-4dfc-9ffb-ae3c790372ee" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.041534 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b54a7c9-430b-4dfc-9ffb-ae3c790372ee" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.042059 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b54a7c9-430b-4dfc-9ffb-ae3c790372ee" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.043228 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.048155 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.048505 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.049060 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.050673 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.067724 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl"] Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.181435 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7zljl\" (UID: \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.181536 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdw7t\" (UniqueName: \"kubernetes.io/projected/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-kube-api-access-bdw7t\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7zljl\" (UID: \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.181828 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7zljl\" (UID: \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.284687 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7zljl\" (UID: \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.285201 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7zljl\" (UID: \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.285263 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdw7t\" (UniqueName: \"kubernetes.io/projected/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-kube-api-access-bdw7t\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7zljl\" (UID: \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.294697 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7zljl\" (UID: 
\"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.294791 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7zljl\" (UID: \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.305294 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdw7t\" (UniqueName: \"kubernetes.io/projected/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-kube-api-access-bdw7t\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7zljl\" (UID: \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.375123 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.822200 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl"] Oct 06 13:35:35 crc kubenswrapper[4867]: I1006 13:35:35.844115 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:35:36 crc kubenswrapper[4867]: I1006 13:35:36.457087 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" event={"ID":"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c","Type":"ContainerStarted","Data":"ffc8827e9781f49a2c727832143808c9106b3731dc0c2f01b1b09ff1e7ea1cf6"} Oct 06 13:35:37 crc kubenswrapper[4867]: I1006 13:35:37.474808 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" event={"ID":"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c","Type":"ContainerStarted","Data":"f6aed41face7d038899659e893c6d43537d9b87a226622ac25b2590d4be7b03b"} Oct 06 13:35:37 crc kubenswrapper[4867]: I1006 13:35:37.509558 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" podStartSLOduration=1.652043683 podStartE2EDuration="2.509534926s" podCreationTimestamp="2025-10-06 13:35:35 +0000 UTC" firstStartedPulling="2025-10-06 13:35:35.843847666 +0000 UTC m=+1915.301795810" lastFinishedPulling="2025-10-06 13:35:36.701338869 +0000 UTC m=+1916.159287053" observedRunningTime="2025-10-06 13:35:37.502671678 +0000 UTC m=+1916.960619832" watchObservedRunningTime="2025-10-06 13:35:37.509534926 +0000 UTC m=+1916.967483070" Oct 06 13:35:40 crc kubenswrapper[4867]: I1006 13:35:40.222411 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:35:40 crc 
kubenswrapper[4867]: E1006 13:35:40.223354 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:35:51 crc kubenswrapper[4867]: I1006 13:35:51.232141 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:35:51 crc kubenswrapper[4867]: E1006 13:35:51.233284 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:36:02 crc kubenswrapper[4867]: I1006 13:36:02.274955 4867 scope.go:117] "RemoveContainer" containerID="89a7d2fda9e6587b25239bebd4814fcae3f31f10adb9606221a28ace9295038f" Oct 06 13:36:04 crc kubenswrapper[4867]: I1006 13:36:04.221646 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:36:04 crc kubenswrapper[4867]: E1006 13:36:04.223569 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 
06 13:36:17 crc kubenswrapper[4867]: I1006 13:36:17.221017 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:36:17 crc kubenswrapper[4867]: E1006 13:36:17.222010 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:36:28 crc kubenswrapper[4867]: I1006 13:36:28.222642 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:36:28 crc kubenswrapper[4867]: E1006 13:36:28.223853 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:36:29 crc kubenswrapper[4867]: I1006 13:36:29.011607 4867 generic.go:334] "Generic (PLEG): container finished" podID="cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c" containerID="f6aed41face7d038899659e893c6d43537d9b87a226622ac25b2590d4be7b03b" exitCode=0 Oct 06 13:36:29 crc kubenswrapper[4867]: I1006 13:36:29.011805 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" event={"ID":"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c","Type":"ContainerDied","Data":"f6aed41face7d038899659e893c6d43537d9b87a226622ac25b2590d4be7b03b"} Oct 06 13:36:30 crc kubenswrapper[4867]: I1006 13:36:30.505924 
4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:36:30 crc kubenswrapper[4867]: I1006 13:36:30.598840 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-inventory\") pod \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\" (UID: \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " Oct 06 13:36:30 crc kubenswrapper[4867]: I1006 13:36:30.598872 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-ssh-key\") pod \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\" (UID: \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " Oct 06 13:36:30 crc kubenswrapper[4867]: I1006 13:36:30.599010 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdw7t\" (UniqueName: \"kubernetes.io/projected/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-kube-api-access-bdw7t\") pod \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\" (UID: \"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c\") " Oct 06 13:36:30 crc kubenswrapper[4867]: I1006 13:36:30.605418 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-kube-api-access-bdw7t" (OuterVolumeSpecName: "kube-api-access-bdw7t") pod "cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c" (UID: "cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c"). InnerVolumeSpecName "kube-api-access-bdw7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:36:30 crc kubenswrapper[4867]: I1006 13:36:30.627738 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-inventory" (OuterVolumeSpecName: "inventory") pod "cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c" (UID: "cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:36:30 crc kubenswrapper[4867]: I1006 13:36:30.630022 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c" (UID: "cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:36:30 crc kubenswrapper[4867]: I1006 13:36:30.701899 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdw7t\" (UniqueName: \"kubernetes.io/projected/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-kube-api-access-bdw7t\") on node \"crc\" DevicePath \"\"" Oct 06 13:36:30 crc kubenswrapper[4867]: I1006 13:36:30.701936 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:36:30 crc kubenswrapper[4867]: I1006 13:36:30.701949 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.036507 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" event={"ID":"cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c","Type":"ContainerDied","Data":"ffc8827e9781f49a2c727832143808c9106b3731dc0c2f01b1b09ff1e7ea1cf6"} Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.036956 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffc8827e9781f49a2c727832143808c9106b3731dc0c2f01b1b09ff1e7ea1cf6" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.036577 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7zljl" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.144941 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-smg6g"] Oct 06 13:36:31 crc kubenswrapper[4867]: E1006 13:36:31.145416 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.145434 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.145713 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.146520 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.148927 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.149642 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.149759 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.152893 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.164852 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-smg6g"] Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.314654 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-smg6g\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.314732 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-smg6g\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.315214 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9tmn8\" (UniqueName: \"kubernetes.io/projected/4b12f715-2704-4545-a627-39426cb3de93-kube-api-access-9tmn8\") pod \"ssh-known-hosts-edpm-deployment-smg6g\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.417855 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tmn8\" (UniqueName: \"kubernetes.io/projected/4b12f715-2704-4545-a627-39426cb3de93-kube-api-access-9tmn8\") pod \"ssh-known-hosts-edpm-deployment-smg6g\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.418010 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-smg6g\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.418073 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-smg6g\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.423899 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-smg6g\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 
13:36:31.424948 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-smg6g\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.439672 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tmn8\" (UniqueName: \"kubernetes.io/projected/4b12f715-2704-4545-a627-39426cb3de93-kube-api-access-9tmn8\") pod \"ssh-known-hosts-edpm-deployment-smg6g\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:31 crc kubenswrapper[4867]: I1006 13:36:31.481852 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:32 crc kubenswrapper[4867]: I1006 13:36:32.101777 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-smg6g"] Oct 06 13:36:33 crc kubenswrapper[4867]: I1006 13:36:33.061299 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" event={"ID":"4b12f715-2704-4545-a627-39426cb3de93","Type":"ContainerStarted","Data":"a9a75cb2dfd62479fc450623c297fe3eb40f603b32fad3be58eb6a4c3d4ba81f"} Oct 06 13:36:33 crc kubenswrapper[4867]: I1006 13:36:33.061990 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" event={"ID":"4b12f715-2704-4545-a627-39426cb3de93","Type":"ContainerStarted","Data":"5aced47ef30af147aaf24f4f2c154f416cc5832f82837a94b9789b7fdcd8ed39"} Oct 06 13:36:33 crc kubenswrapper[4867]: I1006 13:36:33.086082 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" podStartSLOduration=1.429222155 
podStartE2EDuration="2.086052856s" podCreationTimestamp="2025-10-06 13:36:31 +0000 UTC" firstStartedPulling="2025-10-06 13:36:32.098772399 +0000 UTC m=+1971.556720543" lastFinishedPulling="2025-10-06 13:36:32.7556031 +0000 UTC m=+1972.213551244" observedRunningTime="2025-10-06 13:36:33.080389182 +0000 UTC m=+1972.538337356" watchObservedRunningTime="2025-10-06 13:36:33.086052856 +0000 UTC m=+1972.544001030" Oct 06 13:36:41 crc kubenswrapper[4867]: I1006 13:36:41.169628 4867 generic.go:334] "Generic (PLEG): container finished" podID="4b12f715-2704-4545-a627-39426cb3de93" containerID="a9a75cb2dfd62479fc450623c297fe3eb40f603b32fad3be58eb6a4c3d4ba81f" exitCode=0 Oct 06 13:36:41 crc kubenswrapper[4867]: I1006 13:36:41.170727 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" event={"ID":"4b12f715-2704-4545-a627-39426cb3de93","Type":"ContainerDied","Data":"a9a75cb2dfd62479fc450623c297fe3eb40f603b32fad3be58eb6a4c3d4ba81f"} Oct 06 13:36:42 crc kubenswrapper[4867]: I1006 13:36:42.222681 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:36:42 crc kubenswrapper[4867]: E1006 13:36:42.223085 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:36:42 crc kubenswrapper[4867]: I1006 13:36:42.681025 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:42 crc kubenswrapper[4867]: I1006 13:36:42.815357 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tmn8\" (UniqueName: \"kubernetes.io/projected/4b12f715-2704-4545-a627-39426cb3de93-kube-api-access-9tmn8\") pod \"4b12f715-2704-4545-a627-39426cb3de93\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " Oct 06 13:36:42 crc kubenswrapper[4867]: I1006 13:36:42.815591 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-inventory-0\") pod \"4b12f715-2704-4545-a627-39426cb3de93\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " Oct 06 13:36:42 crc kubenswrapper[4867]: I1006 13:36:42.816029 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-ssh-key-openstack-edpm-ipam\") pod \"4b12f715-2704-4545-a627-39426cb3de93\" (UID: \"4b12f715-2704-4545-a627-39426cb3de93\") " Oct 06 13:36:42 crc kubenswrapper[4867]: I1006 13:36:42.821818 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b12f715-2704-4545-a627-39426cb3de93-kube-api-access-9tmn8" (OuterVolumeSpecName: "kube-api-access-9tmn8") pod "4b12f715-2704-4545-a627-39426cb3de93" (UID: "4b12f715-2704-4545-a627-39426cb3de93"). InnerVolumeSpecName "kube-api-access-9tmn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:36:42 crc kubenswrapper[4867]: I1006 13:36:42.847173 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4b12f715-2704-4545-a627-39426cb3de93" (UID: "4b12f715-2704-4545-a627-39426cb3de93"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:36:42 crc kubenswrapper[4867]: I1006 13:36:42.848894 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4b12f715-2704-4545-a627-39426cb3de93" (UID: "4b12f715-2704-4545-a627-39426cb3de93"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:36:42 crc kubenswrapper[4867]: I1006 13:36:42.919616 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 06 13:36:42 crc kubenswrapper[4867]: I1006 13:36:42.919660 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tmn8\" (UniqueName: \"kubernetes.io/projected/4b12f715-2704-4545-a627-39426cb3de93-kube-api-access-9tmn8\") on node \"crc\" DevicePath \"\"" Oct 06 13:36:42 crc kubenswrapper[4867]: I1006 13:36:42.919676 4867 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4b12f715-2704-4545-a627-39426cb3de93-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.194571 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" event={"ID":"4b12f715-2704-4545-a627-39426cb3de93","Type":"ContainerDied","Data":"5aced47ef30af147aaf24f4f2c154f416cc5832f82837a94b9789b7fdcd8ed39"} Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.194633 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aced47ef30af147aaf24f4f2c154f416cc5832f82837a94b9789b7fdcd8ed39" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.194671 
4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-smg6g" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.327872 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx"] Oct 06 13:36:43 crc kubenswrapper[4867]: E1006 13:36:43.328443 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b12f715-2704-4545-a627-39426cb3de93" containerName="ssh-known-hosts-edpm-deployment" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.328460 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b12f715-2704-4545-a627-39426cb3de93" containerName="ssh-known-hosts-edpm-deployment" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.328711 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b12f715-2704-4545-a627-39426cb3de93" containerName="ssh-known-hosts-edpm-deployment" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.329580 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.333028 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.333106 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.333409 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.334716 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.349146 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx"] Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.433808 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g49n\" (UniqueName: \"kubernetes.io/projected/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-kube-api-access-6g49n\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rbjrx\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.433909 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rbjrx\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.433949 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rbjrx\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.537321 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g49n\" (UniqueName: \"kubernetes.io/projected/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-kube-api-access-6g49n\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rbjrx\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.537430 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rbjrx\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.537486 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rbjrx\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.543107 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rbjrx\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.553174 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rbjrx\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.566127 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g49n\" (UniqueName: \"kubernetes.io/projected/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-kube-api-access-6g49n\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rbjrx\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:43 crc kubenswrapper[4867]: I1006 13:36:43.649299 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:44 crc kubenswrapper[4867]: I1006 13:36:44.342323 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx"] Oct 06 13:36:45 crc kubenswrapper[4867]: I1006 13:36:45.239123 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" event={"ID":"8dec95c2-2ac5-4886-b1e1-ab333d4f5907","Type":"ContainerStarted","Data":"9df6ea97f6654e1e522998f5a454c27cb572719db3c776807975b6e5f82c7bed"} Oct 06 13:36:45 crc kubenswrapper[4867]: I1006 13:36:45.239595 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" event={"ID":"8dec95c2-2ac5-4886-b1e1-ab333d4f5907","Type":"ContainerStarted","Data":"17a90554c5c1ebfed2bed84e167cb4d514ea87d2acee515a04ceb1b9055365ec"} Oct 06 13:36:45 crc kubenswrapper[4867]: I1006 13:36:45.252482 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" podStartSLOduration=1.7968543609999998 podStartE2EDuration="2.252457436s" podCreationTimestamp="2025-10-06 13:36:43 +0000 UTC" firstStartedPulling="2025-10-06 13:36:44.35788079 +0000 UTC m=+1983.815828934" lastFinishedPulling="2025-10-06 13:36:44.813483825 +0000 UTC m=+1984.271432009" observedRunningTime="2025-10-06 13:36:45.248058606 +0000 UTC m=+1984.706006760" watchObservedRunningTime="2025-10-06 13:36:45.252457436 +0000 UTC m=+1984.710405590" Oct 06 13:36:54 crc kubenswrapper[4867]: I1006 13:36:54.333555 4867 generic.go:334] "Generic (PLEG): container finished" podID="8dec95c2-2ac5-4886-b1e1-ab333d4f5907" containerID="9df6ea97f6654e1e522998f5a454c27cb572719db3c776807975b6e5f82c7bed" exitCode=0 Oct 06 13:36:54 crc kubenswrapper[4867]: I1006 13:36:54.333672 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" event={"ID":"8dec95c2-2ac5-4886-b1e1-ab333d4f5907","Type":"ContainerDied","Data":"9df6ea97f6654e1e522998f5a454c27cb572719db3c776807975b6e5f82c7bed"} Oct 06 13:36:55 crc kubenswrapper[4867]: I1006 13:36:55.222405 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:36:55 crc kubenswrapper[4867]: I1006 13:36:55.803944 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:55 crc kubenswrapper[4867]: I1006 13:36:55.962245 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-inventory\") pod \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " Oct 06 13:36:55 crc kubenswrapper[4867]: I1006 13:36:55.962448 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-ssh-key\") pod \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " Oct 06 13:36:55 crc kubenswrapper[4867]: I1006 13:36:55.962582 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g49n\" (UniqueName: \"kubernetes.io/projected/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-kube-api-access-6g49n\") pod \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\" (UID: \"8dec95c2-2ac5-4886-b1e1-ab333d4f5907\") " Oct 06 13:36:55 crc kubenswrapper[4867]: I1006 13:36:55.974809 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-kube-api-access-6g49n" (OuterVolumeSpecName: "kube-api-access-6g49n") pod "8dec95c2-2ac5-4886-b1e1-ab333d4f5907" (UID: 
"8dec95c2-2ac5-4886-b1e1-ab333d4f5907"). InnerVolumeSpecName "kube-api-access-6g49n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:36:55 crc kubenswrapper[4867]: I1006 13:36:55.997588 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-inventory" (OuterVolumeSpecName: "inventory") pod "8dec95c2-2ac5-4886-b1e1-ab333d4f5907" (UID: "8dec95c2-2ac5-4886-b1e1-ab333d4f5907"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.010865 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8dec95c2-2ac5-4886-b1e1-ab333d4f5907" (UID: "8dec95c2-2ac5-4886-b1e1-ab333d4f5907"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.065328 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.065373 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g49n\" (UniqueName: \"kubernetes.io/projected/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-kube-api-access-6g49n\") on node \"crc\" DevicePath \"\"" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.065390 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dec95c2-2ac5-4886-b1e1-ab333d4f5907-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.369391 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" 
event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"a0b72170aaf35c6098ea3145446b3d986f52f72aeca02885baaa1bfa32d9e577"} Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.372675 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" event={"ID":"8dec95c2-2ac5-4886-b1e1-ab333d4f5907","Type":"ContainerDied","Data":"17a90554c5c1ebfed2bed84e167cb4d514ea87d2acee515a04ceb1b9055365ec"} Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.372723 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17a90554c5c1ebfed2bed84e167cb4d514ea87d2acee515a04ceb1b9055365ec" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.372780 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rbjrx" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.471567 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w"] Oct 06 13:36:56 crc kubenswrapper[4867]: E1006 13:36:56.472201 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dec95c2-2ac5-4886-b1e1-ab333d4f5907" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.472238 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dec95c2-2ac5-4886-b1e1-ab333d4f5907" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.472714 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dec95c2-2ac5-4886-b1e1-ab333d4f5907" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.474809 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.479719 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w"] Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.481910 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.482149 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.482421 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.482582 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.577025 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxswc\" (UniqueName: \"kubernetes.io/projected/5759403e-a3b6-4553-9e27-f471a616644f-kube-api-access-kxswc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w\" (UID: \"5759403e-a3b6-4553-9e27-f471a616644f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.577510 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w\" (UID: \"5759403e-a3b6-4553-9e27-f471a616644f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.577904 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w\" (UID: \"5759403e-a3b6-4553-9e27-f471a616644f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.680814 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxswc\" (UniqueName: \"kubernetes.io/projected/5759403e-a3b6-4553-9e27-f471a616644f-kube-api-access-kxswc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w\" (UID: \"5759403e-a3b6-4553-9e27-f471a616644f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.680896 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w\" (UID: \"5759403e-a3b6-4553-9e27-f471a616644f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.681061 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w\" (UID: \"5759403e-a3b6-4553-9e27-f471a616644f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.688538 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w\" (UID: \"5759403e-a3b6-4553-9e27-f471a616644f\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.688948 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w\" (UID: \"5759403e-a3b6-4553-9e27-f471a616644f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.700592 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxswc\" (UniqueName: \"kubernetes.io/projected/5759403e-a3b6-4553-9e27-f471a616644f-kube-api-access-kxswc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w\" (UID: \"5759403e-a3b6-4553-9e27-f471a616644f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:36:56 crc kubenswrapper[4867]: I1006 13:36:56.806165 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:36:57 crc kubenswrapper[4867]: I1006 13:36:57.480207 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w"] Oct 06 13:36:57 crc kubenswrapper[4867]: W1006 13:36:57.494782 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5759403e_a3b6_4553_9e27_f471a616644f.slice/crio-ec2ac8264eda29364d40858420cb594dbb9514c39e41708866a911e08632aa41 WatchSource:0}: Error finding container ec2ac8264eda29364d40858420cb594dbb9514c39e41708866a911e08632aa41: Status 404 returned error can't find the container with id ec2ac8264eda29364d40858420cb594dbb9514c39e41708866a911e08632aa41 Oct 06 13:36:58 crc kubenswrapper[4867]: I1006 13:36:58.398515 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" event={"ID":"5759403e-a3b6-4553-9e27-f471a616644f","Type":"ContainerStarted","Data":"77ee80636fa8bf050f7d88b20ab3a1c9a3eafd5b0f4771bfca69b31c8f7b9715"} Oct 06 13:36:58 crc kubenswrapper[4867]: I1006 13:36:58.398875 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" event={"ID":"5759403e-a3b6-4553-9e27-f471a616644f","Type":"ContainerStarted","Data":"ec2ac8264eda29364d40858420cb594dbb9514c39e41708866a911e08632aa41"} Oct 06 13:36:58 crc kubenswrapper[4867]: I1006 13:36:58.420338 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" podStartSLOduration=2.026187444 podStartE2EDuration="2.42031842s" podCreationTimestamp="2025-10-06 13:36:56 +0000 UTC" firstStartedPulling="2025-10-06 13:36:57.499113157 +0000 UTC m=+1996.957061301" lastFinishedPulling="2025-10-06 13:36:57.893244093 +0000 UTC m=+1997.351192277" 
observedRunningTime="2025-10-06 13:36:58.41516994 +0000 UTC m=+1997.873118084" watchObservedRunningTime="2025-10-06 13:36:58.42031842 +0000 UTC m=+1997.878266564" Oct 06 13:37:08 crc kubenswrapper[4867]: I1006 13:37:08.494323 4867 generic.go:334] "Generic (PLEG): container finished" podID="5759403e-a3b6-4553-9e27-f471a616644f" containerID="77ee80636fa8bf050f7d88b20ab3a1c9a3eafd5b0f4771bfca69b31c8f7b9715" exitCode=0 Oct 06 13:37:08 crc kubenswrapper[4867]: I1006 13:37:08.494424 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" event={"ID":"5759403e-a3b6-4553-9e27-f471a616644f","Type":"ContainerDied","Data":"77ee80636fa8bf050f7d88b20ab3a1c9a3eafd5b0f4771bfca69b31c8f7b9715"} Oct 06 13:37:09 crc kubenswrapper[4867]: I1006 13:37:09.967217 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.139324 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxswc\" (UniqueName: \"kubernetes.io/projected/5759403e-a3b6-4553-9e27-f471a616644f-kube-api-access-kxswc\") pod \"5759403e-a3b6-4553-9e27-f471a616644f\" (UID: \"5759403e-a3b6-4553-9e27-f471a616644f\") " Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.139469 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-ssh-key\") pod \"5759403e-a3b6-4553-9e27-f471a616644f\" (UID: \"5759403e-a3b6-4553-9e27-f471a616644f\") " Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.139620 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-inventory\") pod \"5759403e-a3b6-4553-9e27-f471a616644f\" (UID: 
\"5759403e-a3b6-4553-9e27-f471a616644f\") " Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.154913 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5759403e-a3b6-4553-9e27-f471a616644f-kube-api-access-kxswc" (OuterVolumeSpecName: "kube-api-access-kxswc") pod "5759403e-a3b6-4553-9e27-f471a616644f" (UID: "5759403e-a3b6-4553-9e27-f471a616644f"). InnerVolumeSpecName "kube-api-access-kxswc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.183499 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5759403e-a3b6-4553-9e27-f471a616644f" (UID: "5759403e-a3b6-4553-9e27-f471a616644f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.191740 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-inventory" (OuterVolumeSpecName: "inventory") pod "5759403e-a3b6-4553-9e27-f471a616644f" (UID: "5759403e-a3b6-4553-9e27-f471a616644f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.241856 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.243982 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5759403e-a3b6-4553-9e27-f471a616644f-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.244006 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxswc\" (UniqueName: \"kubernetes.io/projected/5759403e-a3b6-4553-9e27-f471a616644f-kube-api-access-kxswc\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.519366 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" event={"ID":"5759403e-a3b6-4553-9e27-f471a616644f","Type":"ContainerDied","Data":"ec2ac8264eda29364d40858420cb594dbb9514c39e41708866a911e08632aa41"} Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.519690 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec2ac8264eda29364d40858420cb594dbb9514c39e41708866a911e08632aa41" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.519492 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.623039 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf"] Oct 06 13:37:10 crc kubenswrapper[4867]: E1006 13:37:10.623759 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5759403e-a3b6-4553-9e27-f471a616644f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.623795 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5759403e-a3b6-4553-9e27-f471a616644f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.624127 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5759403e-a3b6-4553-9e27-f471a616644f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.625346 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.630042 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.676280 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.683813 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.685010 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.685316 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.685786 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.686984 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.696816 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf"] Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.698825 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.803113 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.803560 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.803665 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.803749 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.803828 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.803918 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.804053 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.804207 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.804300 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.804392 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.804500 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gp8l\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-kube-api-access-9gp8l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.804579 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.804681 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.804784 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.907058 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.907373 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.907478 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.907597 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gp8l\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-kube-api-access-9gp8l\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.907680 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.907814 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.907910 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.908022 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.908143 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.908286 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.908412 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc 
kubenswrapper[4867]: I1006 13:37:10.908546 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.908655 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.908794 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.912353 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.912854 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.913071 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.914194 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.914576 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.914587 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.914920 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.915942 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.916348 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.916454 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.917448 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.919005 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.919926 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:10 crc kubenswrapper[4867]: I1006 13:37:10.926807 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gp8l\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-kube-api-access-9gp8l\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:11 crc kubenswrapper[4867]: I1006 13:37:11.004300 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:11 crc kubenswrapper[4867]: I1006 13:37:11.545155 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf"] Oct 06 13:37:12 crc kubenswrapper[4867]: I1006 13:37:12.543010 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" event={"ID":"4c656316-c726-4675-9209-cf119811bc63","Type":"ContainerStarted","Data":"77e7e44364ef3a16882788d00d10914fbdfe2eff3a6b2b6112cd6435f3de74b2"} Oct 06 13:37:17 crc kubenswrapper[4867]: I1006 13:37:17.592033 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" event={"ID":"4c656316-c726-4675-9209-cf119811bc63","Type":"ContainerStarted","Data":"708dd1561ca520dde921cafb01dd8a991419821861f112a128fe9145808275bb"} Oct 06 13:37:17 crc kubenswrapper[4867]: I1006 13:37:17.615163 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" podStartSLOduration=2.601657389 podStartE2EDuration="7.615143863s" podCreationTimestamp="2025-10-06 13:37:10 +0000 UTC" firstStartedPulling="2025-10-06 13:37:11.544204584 +0000 UTC m=+2011.002152748" lastFinishedPulling="2025-10-06 13:37:16.557691078 +0000 UTC m=+2016.015639222" observedRunningTime="2025-10-06 13:37:17.610737603 +0000 UTC m=+2017.068685747" watchObservedRunningTime="2025-10-06 13:37:17.615143863 +0000 UTC m=+2017.073092007" Oct 06 13:37:58 crc kubenswrapper[4867]: I1006 13:37:58.011708 4867 
generic.go:334] "Generic (PLEG): container finished" podID="4c656316-c726-4675-9209-cf119811bc63" containerID="708dd1561ca520dde921cafb01dd8a991419821861f112a128fe9145808275bb" exitCode=0 Oct 06 13:37:58 crc kubenswrapper[4867]: I1006 13:37:58.011773 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" event={"ID":"4c656316-c726-4675-9209-cf119811bc63","Type":"ContainerDied","Data":"708dd1561ca520dde921cafb01dd8a991419821861f112a128fe9145808275bb"} Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.495890 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.612932 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ssh-key\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613090 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613144 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-neutron-metadata-combined-ca-bundle\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613229 4867 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-bootstrap-combined-ca-bundle\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613267 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-ovn-default-certs-0\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613288 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613356 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-libvirt-combined-ca-bundle\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613410 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-inventory\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613448 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-repo-setup-combined-ca-bundle\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613470 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-telemetry-combined-ca-bundle\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613496 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gp8l\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-kube-api-access-9gp8l\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613528 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-nova-combined-ca-bundle\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613555 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ovn-combined-ca-bundle\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.613597 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"4c656316-c726-4675-9209-cf119811bc63\" (UID: \"4c656316-c726-4675-9209-cf119811bc63\") " Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.620707 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.621422 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.621568 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.622610 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.623395 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-kube-api-access-9gp8l" (OuterVolumeSpecName: "kube-api-access-9gp8l") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "kube-api-access-9gp8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.624323 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.624392 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.624408 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.626007 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.626594 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.627429 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.633362 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.651026 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-inventory" (OuterVolumeSpecName: "inventory") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.655156 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4c656316-c726-4675-9209-cf119811bc63" (UID: "4c656316-c726-4675-9209-cf119811bc63"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716300 4867 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716346 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716363 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716378 4867 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716392 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716406 4867 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716420 4867 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716432 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gp8l\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-kube-api-access-9gp8l\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716445 4867 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716457 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716471 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716484 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716498 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4c656316-c726-4675-9209-cf119811bc63-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.716513 4867 reconciler_common.go:293] 
"Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c656316-c726-4675-9209-cf119811bc63-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.740981 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5n5sc"] Oct 06 13:37:59 crc kubenswrapper[4867]: E1006 13:37:59.741749 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c656316-c726-4675-9209-cf119811bc63" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.741847 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c656316-c726-4675-9209-cf119811bc63" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.742143 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c656316-c726-4675-9209-cf119811bc63" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.744166 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.760886 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5n5sc"] Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.818629 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z26j7\" (UniqueName: \"kubernetes.io/projected/f810aadd-e7db-4427-a6fd-34657dcf1100-kube-api-access-z26j7\") pod \"certified-operators-5n5sc\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.818843 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-utilities\") pod \"certified-operators-5n5sc\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.818900 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-catalog-content\") pod \"certified-operators-5n5sc\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.921360 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-utilities\") pod \"certified-operators-5n5sc\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.921427 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-catalog-content\") pod \"certified-operators-5n5sc\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.921503 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z26j7\" (UniqueName: \"kubernetes.io/projected/f810aadd-e7db-4427-a6fd-34657dcf1100-kube-api-access-z26j7\") pod \"certified-operators-5n5sc\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.921911 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-utilities\") pod \"certified-operators-5n5sc\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.922080 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-catalog-content\") pod \"certified-operators-5n5sc\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:37:59 crc kubenswrapper[4867]: I1006 13:37:59.942471 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z26j7\" (UniqueName: \"kubernetes.io/projected/f810aadd-e7db-4427-a6fd-34657dcf1100-kube-api-access-z26j7\") pod \"certified-operators-5n5sc\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.036071 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" event={"ID":"4c656316-c726-4675-9209-cf119811bc63","Type":"ContainerDied","Data":"77e7e44364ef3a16882788d00d10914fbdfe2eff3a6b2b6112cd6435f3de74b2"} Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.036117 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77e7e44364ef3a16882788d00d10914fbdfe2eff3a6b2b6112cd6435f3de74b2" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.036187 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.086454 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.206776 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx"] Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.208536 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.216560 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.224218 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx"] Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.234500 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.234610 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.234707 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.234729 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5bvt\" (UniqueName: 
\"kubernetes.io/projected/4960b423-de56-4b83-a577-f551c82c2702-kube-api-access-v5bvt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.234753 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4960b423-de56-4b83-a577-f551c82c2702-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.254837 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.255070 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.255177 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.255878 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.367604 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.367701 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.367801 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.367831 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5bvt\" (UniqueName: \"kubernetes.io/projected/4960b423-de56-4b83-a577-f551c82c2702-kube-api-access-v5bvt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.367854 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4960b423-de56-4b83-a577-f551c82c2702-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.368767 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4960b423-de56-4b83-a577-f551c82c2702-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 
13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.373057 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.376962 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.380864 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.396091 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5bvt\" (UniqueName: \"kubernetes.io/projected/4960b423-de56-4b83-a577-f551c82c2702-kube-api-access-v5bvt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7vftx\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.583433 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" Oct 06 13:38:00 crc kubenswrapper[4867]: I1006 13:38:00.732389 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5n5sc"] Oct 06 13:38:01 crc kubenswrapper[4867]: I1006 13:38:01.049644 4867 generic.go:334] "Generic (PLEG): container finished" podID="f810aadd-e7db-4427-a6fd-34657dcf1100" containerID="bdf3b96a622def0eedfbee3301dff6badff884bd8a9da18e003bea30321a67ff" exitCode=0 Oct 06 13:38:01 crc kubenswrapper[4867]: I1006 13:38:01.049842 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n5sc" event={"ID":"f810aadd-e7db-4427-a6fd-34657dcf1100","Type":"ContainerDied","Data":"bdf3b96a622def0eedfbee3301dff6badff884bd8a9da18e003bea30321a67ff"} Oct 06 13:38:01 crc kubenswrapper[4867]: I1006 13:38:01.050057 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n5sc" event={"ID":"f810aadd-e7db-4427-a6fd-34657dcf1100","Type":"ContainerStarted","Data":"26e4f5f2e5d4cffa56ee11515d923ccf94b143985af508782ff80d408642bb15"} Oct 06 13:38:01 crc kubenswrapper[4867]: I1006 13:38:01.118333 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx"] Oct 06 13:38:01 crc kubenswrapper[4867]: W1006 13:38:01.123985 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4960b423_de56_4b83_a577_f551c82c2702.slice/crio-2e1ac18a2d354a7ece019601ccb22d1eacc852110bc8b46967abfa996e1f96c5 WatchSource:0}: Error finding container 2e1ac18a2d354a7ece019601ccb22d1eacc852110bc8b46967abfa996e1f96c5: Status 404 returned error can't find the container with id 2e1ac18a2d354a7ece019601ccb22d1eacc852110bc8b46967abfa996e1f96c5 Oct 06 13:38:02 crc kubenswrapper[4867]: I1006 13:38:02.061223 4867 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-5n5sc" event={"ID":"f810aadd-e7db-4427-a6fd-34657dcf1100","Type":"ContainerStarted","Data":"6d26cfe9589718fb90731153fafcc18182f9c2e2013d10d5f9e17539937cbedb"} Oct 06 13:38:02 crc kubenswrapper[4867]: I1006 13:38:02.064264 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" event={"ID":"4960b423-de56-4b83-a577-f551c82c2702","Type":"ContainerStarted","Data":"54c61138d9fe4302ea7091249c096e2a3a928bc5b1aedd9a5896f37410cd956c"} Oct 06 13:38:02 crc kubenswrapper[4867]: I1006 13:38:02.064308 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" event={"ID":"4960b423-de56-4b83-a577-f551c82c2702","Type":"ContainerStarted","Data":"2e1ac18a2d354a7ece019601ccb22d1eacc852110bc8b46967abfa996e1f96c5"} Oct 06 13:38:02 crc kubenswrapper[4867]: I1006 13:38:02.113661 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" podStartSLOduration=1.660887708 podStartE2EDuration="2.113638035s" podCreationTimestamp="2025-10-06 13:38:00 +0000 UTC" firstStartedPulling="2025-10-06 13:38:01.127416917 +0000 UTC m=+2060.585365061" lastFinishedPulling="2025-10-06 13:38:01.580167244 +0000 UTC m=+2061.038115388" observedRunningTime="2025-10-06 13:38:02.109985655 +0000 UTC m=+2061.567933799" watchObservedRunningTime="2025-10-06 13:38:02.113638035 +0000 UTC m=+2061.571586179" Oct 06 13:38:03 crc kubenswrapper[4867]: I1006 13:38:03.075065 4867 generic.go:334] "Generic (PLEG): container finished" podID="f810aadd-e7db-4427-a6fd-34657dcf1100" containerID="6d26cfe9589718fb90731153fafcc18182f9c2e2013d10d5f9e17539937cbedb" exitCode=0 Oct 06 13:38:03 crc kubenswrapper[4867]: I1006 13:38:03.075123 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n5sc" 
event={"ID":"f810aadd-e7db-4427-a6fd-34657dcf1100","Type":"ContainerDied","Data":"6d26cfe9589718fb90731153fafcc18182f9c2e2013d10d5f9e17539937cbedb"} Oct 06 13:38:04 crc kubenswrapper[4867]: I1006 13:38:04.090605 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n5sc" event={"ID":"f810aadd-e7db-4427-a6fd-34657dcf1100","Type":"ContainerStarted","Data":"b5b8fbbcfd86d9107333532ec0df10d61c4e89125f39679d933e454974356396"} Oct 06 13:38:04 crc kubenswrapper[4867]: I1006 13:38:04.114024 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5n5sc" podStartSLOduration=2.607241103 podStartE2EDuration="5.114005346s" podCreationTimestamp="2025-10-06 13:37:59 +0000 UTC" firstStartedPulling="2025-10-06 13:38:01.054616498 +0000 UTC m=+2060.512564632" lastFinishedPulling="2025-10-06 13:38:03.561380711 +0000 UTC m=+2063.019328875" observedRunningTime="2025-10-06 13:38:04.113714528 +0000 UTC m=+2063.571662682" watchObservedRunningTime="2025-10-06 13:38:04.114005346 +0000 UTC m=+2063.571953490" Oct 06 13:38:09 crc kubenswrapper[4867]: I1006 13:38:09.901768 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w977z"] Oct 06 13:38:09 crc kubenswrapper[4867]: I1006 13:38:09.905579 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:09 crc kubenswrapper[4867]: I1006 13:38:09.915180 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w977z"] Oct 06 13:38:09 crc kubenswrapper[4867]: I1006 13:38:09.976392 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-catalog-content\") pod \"community-operators-w977z\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:09 crc kubenswrapper[4867]: I1006 13:38:09.976491 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-utilities\") pod \"community-operators-w977z\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:09 crc kubenswrapper[4867]: I1006 13:38:09.976510 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7tnr\" (UniqueName: \"kubernetes.io/projected/0744054a-4883-42d8-abef-dcbf202d1030-kube-api-access-c7tnr\") pod \"community-operators-w977z\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:10 crc kubenswrapper[4867]: I1006 13:38:10.078567 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-catalog-content\") pod \"community-operators-w977z\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:10 crc kubenswrapper[4867]: I1006 13:38:10.078685 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-utilities\") pod \"community-operators-w977z\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:10 crc kubenswrapper[4867]: I1006 13:38:10.078707 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7tnr\" (UniqueName: \"kubernetes.io/projected/0744054a-4883-42d8-abef-dcbf202d1030-kube-api-access-c7tnr\") pod \"community-operators-w977z\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:10 crc kubenswrapper[4867]: I1006 13:38:10.079228 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-catalog-content\") pod \"community-operators-w977z\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:10 crc kubenswrapper[4867]: I1006 13:38:10.079653 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-utilities\") pod \"community-operators-w977z\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:10 crc kubenswrapper[4867]: I1006 13:38:10.087785 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:38:10 crc kubenswrapper[4867]: I1006 13:38:10.088422 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:38:10 crc kubenswrapper[4867]: I1006 13:38:10.120220 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c7tnr\" (UniqueName: \"kubernetes.io/projected/0744054a-4883-42d8-abef-dcbf202d1030-kube-api-access-c7tnr\") pod \"community-operators-w977z\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:10 crc kubenswrapper[4867]: I1006 13:38:10.187479 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:38:10 crc kubenswrapper[4867]: I1006 13:38:10.237653 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:10 crc kubenswrapper[4867]: I1006 13:38:10.791470 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w977z"] Oct 06 13:38:10 crc kubenswrapper[4867]: W1006 13:38:10.803966 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0744054a_4883_42d8_abef_dcbf202d1030.slice/crio-74529481418a64568e0db8ffcdd724b121512b5872b779b720da07506c450afc WatchSource:0}: Error finding container 74529481418a64568e0db8ffcdd724b121512b5872b779b720da07506c450afc: Status 404 returned error can't find the container with id 74529481418a64568e0db8ffcdd724b121512b5872b779b720da07506c450afc Oct 06 13:38:11 crc kubenswrapper[4867]: I1006 13:38:11.163751 4867 generic.go:334] "Generic (PLEG): container finished" podID="0744054a-4883-42d8-abef-dcbf202d1030" containerID="b18d1f9c6a7ea10f1244220e27b181450b1d69d7c37b7d9ba69e6fdbd02df826" exitCode=0 Oct 06 13:38:11 crc kubenswrapper[4867]: I1006 13:38:11.164194 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w977z" event={"ID":"0744054a-4883-42d8-abef-dcbf202d1030","Type":"ContainerDied","Data":"b18d1f9c6a7ea10f1244220e27b181450b1d69d7c37b7d9ba69e6fdbd02df826"} Oct 06 13:38:11 crc kubenswrapper[4867]: I1006 
13:38:11.164273 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w977z" event={"ID":"0744054a-4883-42d8-abef-dcbf202d1030","Type":"ContainerStarted","Data":"74529481418a64568e0db8ffcdd724b121512b5872b779b720da07506c450afc"} Oct 06 13:38:11 crc kubenswrapper[4867]: I1006 13:38:11.236563 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:38:12 crc kubenswrapper[4867]: I1006 13:38:12.174859 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w977z" event={"ID":"0744054a-4883-42d8-abef-dcbf202d1030","Type":"ContainerStarted","Data":"62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6"} Oct 06 13:38:13 crc kubenswrapper[4867]: I1006 13:38:13.191786 4867 generic.go:334] "Generic (PLEG): container finished" podID="0744054a-4883-42d8-abef-dcbf202d1030" containerID="62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6" exitCode=0 Oct 06 13:38:13 crc kubenswrapper[4867]: I1006 13:38:13.191846 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w977z" event={"ID":"0744054a-4883-42d8-abef-dcbf202d1030","Type":"ContainerDied","Data":"62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6"} Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.060800 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5n5sc"] Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.061473 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5n5sc" podUID="f810aadd-e7db-4427-a6fd-34657dcf1100" containerName="registry-server" containerID="cri-o://b5b8fbbcfd86d9107333532ec0df10d61c4e89125f39679d933e454974356396" gracePeriod=2 Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.207411 4867 generic.go:334] 
"Generic (PLEG): container finished" podID="f810aadd-e7db-4427-a6fd-34657dcf1100" containerID="b5b8fbbcfd86d9107333532ec0df10d61c4e89125f39679d933e454974356396" exitCode=0 Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.207463 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n5sc" event={"ID":"f810aadd-e7db-4427-a6fd-34657dcf1100","Type":"ContainerDied","Data":"b5b8fbbcfd86d9107333532ec0df10d61c4e89125f39679d933e454974356396"} Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.210853 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w977z" event={"ID":"0744054a-4883-42d8-abef-dcbf202d1030","Type":"ContainerStarted","Data":"0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20"} Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.238504 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w977z" podStartSLOduration=2.585689567 podStartE2EDuration="5.238482839s" podCreationTimestamp="2025-10-06 13:38:09 +0000 UTC" firstStartedPulling="2025-10-06 13:38:11.166387264 +0000 UTC m=+2070.624335408" lastFinishedPulling="2025-10-06 13:38:13.819180526 +0000 UTC m=+2073.277128680" observedRunningTime="2025-10-06 13:38:14.234925812 +0000 UTC m=+2073.692873956" watchObservedRunningTime="2025-10-06 13:38:14.238482839 +0000 UTC m=+2073.696430983" Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.565451 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.608052 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z26j7\" (UniqueName: \"kubernetes.io/projected/f810aadd-e7db-4427-a6fd-34657dcf1100-kube-api-access-z26j7\") pod \"f810aadd-e7db-4427-a6fd-34657dcf1100\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.608171 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-catalog-content\") pod \"f810aadd-e7db-4427-a6fd-34657dcf1100\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.608396 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-utilities\") pod \"f810aadd-e7db-4427-a6fd-34657dcf1100\" (UID: \"f810aadd-e7db-4427-a6fd-34657dcf1100\") " Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.609398 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-utilities" (OuterVolumeSpecName: "utilities") pod "f810aadd-e7db-4427-a6fd-34657dcf1100" (UID: "f810aadd-e7db-4427-a6fd-34657dcf1100"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.613929 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f810aadd-e7db-4427-a6fd-34657dcf1100-kube-api-access-z26j7" (OuterVolumeSpecName: "kube-api-access-z26j7") pod "f810aadd-e7db-4427-a6fd-34657dcf1100" (UID: "f810aadd-e7db-4427-a6fd-34657dcf1100"). InnerVolumeSpecName "kube-api-access-z26j7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.656620 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f810aadd-e7db-4427-a6fd-34657dcf1100" (UID: "f810aadd-e7db-4427-a6fd-34657dcf1100"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.710854 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.711215 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f810aadd-e7db-4427-a6fd-34657dcf1100-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:38:14 crc kubenswrapper[4867]: I1006 13:38:14.711226 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z26j7\" (UniqueName: \"kubernetes.io/projected/f810aadd-e7db-4427-a6fd-34657dcf1100-kube-api-access-z26j7\") on node \"crc\" DevicePath \"\"" Oct 06 13:38:15 crc kubenswrapper[4867]: I1006 13:38:15.229126 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5n5sc" Oct 06 13:38:15 crc kubenswrapper[4867]: I1006 13:38:15.235615 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5n5sc" event={"ID":"f810aadd-e7db-4427-a6fd-34657dcf1100","Type":"ContainerDied","Data":"26e4f5f2e5d4cffa56ee11515d923ccf94b143985af508782ff80d408642bb15"} Oct 06 13:38:15 crc kubenswrapper[4867]: I1006 13:38:15.235670 4867 scope.go:117] "RemoveContainer" containerID="b5b8fbbcfd86d9107333532ec0df10d61c4e89125f39679d933e454974356396" Oct 06 13:38:15 crc kubenswrapper[4867]: I1006 13:38:15.281470 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5n5sc"] Oct 06 13:38:15 crc kubenswrapper[4867]: I1006 13:38:15.281726 4867 scope.go:117] "RemoveContainer" containerID="6d26cfe9589718fb90731153fafcc18182f9c2e2013d10d5f9e17539937cbedb" Oct 06 13:38:15 crc kubenswrapper[4867]: I1006 13:38:15.301579 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5n5sc"] Oct 06 13:38:15 crc kubenswrapper[4867]: I1006 13:38:15.305485 4867 scope.go:117] "RemoveContainer" containerID="bdf3b96a622def0eedfbee3301dff6badff884bd8a9da18e003bea30321a67ff" Oct 06 13:38:17 crc kubenswrapper[4867]: I1006 13:38:17.238481 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f810aadd-e7db-4427-a6fd-34657dcf1100" path="/var/lib/kubelet/pods/f810aadd-e7db-4427-a6fd-34657dcf1100/volumes" Oct 06 13:38:18 crc kubenswrapper[4867]: I1006 13:38:18.868722 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rp7zl"] Oct 06 13:38:18 crc kubenswrapper[4867]: E1006 13:38:18.869150 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f810aadd-e7db-4427-a6fd-34657dcf1100" containerName="extract-utilities" Oct 06 13:38:18 crc kubenswrapper[4867]: I1006 13:38:18.869163 4867 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f810aadd-e7db-4427-a6fd-34657dcf1100" containerName="extract-utilities" Oct 06 13:38:18 crc kubenswrapper[4867]: E1006 13:38:18.869211 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f810aadd-e7db-4427-a6fd-34657dcf1100" containerName="registry-server" Oct 06 13:38:18 crc kubenswrapper[4867]: I1006 13:38:18.869217 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f810aadd-e7db-4427-a6fd-34657dcf1100" containerName="registry-server" Oct 06 13:38:18 crc kubenswrapper[4867]: E1006 13:38:18.869233 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f810aadd-e7db-4427-a6fd-34657dcf1100" containerName="extract-content" Oct 06 13:38:18 crc kubenswrapper[4867]: I1006 13:38:18.869238 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f810aadd-e7db-4427-a6fd-34657dcf1100" containerName="extract-content" Oct 06 13:38:18 crc kubenswrapper[4867]: I1006 13:38:18.869431 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f810aadd-e7db-4427-a6fd-34657dcf1100" containerName="registry-server" Oct 06 13:38:18 crc kubenswrapper[4867]: I1006 13:38:18.870863 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rp7zl" Oct 06 13:38:18 crc kubenswrapper[4867]: I1006 13:38:18.885308 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rp7zl"] Oct 06 13:38:18 crc kubenswrapper[4867]: I1006 13:38:18.932005 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-utilities\") pod \"redhat-operators-rp7zl\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") " pod="openshift-marketplace/redhat-operators-rp7zl" Oct 06 13:38:18 crc kubenswrapper[4867]: I1006 13:38:18.932073 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-catalog-content\") pod \"redhat-operators-rp7zl\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") " pod="openshift-marketplace/redhat-operators-rp7zl" Oct 06 13:38:18 crc kubenswrapper[4867]: I1006 13:38:18.932316 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7f86\" (UniqueName: \"kubernetes.io/projected/7d67f9ae-4442-45e2-bae0-8cda5732e659-kube-api-access-v7f86\") pod \"redhat-operators-rp7zl\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") " pod="openshift-marketplace/redhat-operators-rp7zl" Oct 06 13:38:19 crc kubenswrapper[4867]: I1006 13:38:19.033893 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7f86\" (UniqueName: \"kubernetes.io/projected/7d67f9ae-4442-45e2-bae0-8cda5732e659-kube-api-access-v7f86\") pod \"redhat-operators-rp7zl\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") " pod="openshift-marketplace/redhat-operators-rp7zl" Oct 06 13:38:19 crc kubenswrapper[4867]: I1006 13:38:19.034037 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-utilities\") pod \"redhat-operators-rp7zl\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") " pod="openshift-marketplace/redhat-operators-rp7zl" Oct 06 13:38:19 crc kubenswrapper[4867]: I1006 13:38:19.034075 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-catalog-content\") pod \"redhat-operators-rp7zl\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") " pod="openshift-marketplace/redhat-operators-rp7zl" Oct 06 13:38:19 crc kubenswrapper[4867]: I1006 13:38:19.034657 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-catalog-content\") pod \"redhat-operators-rp7zl\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") " pod="openshift-marketplace/redhat-operators-rp7zl" Oct 06 13:38:19 crc kubenswrapper[4867]: I1006 13:38:19.035225 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-utilities\") pod \"redhat-operators-rp7zl\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") " pod="openshift-marketplace/redhat-operators-rp7zl" Oct 06 13:38:19 crc kubenswrapper[4867]: I1006 13:38:19.059896 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7f86\" (UniqueName: \"kubernetes.io/projected/7d67f9ae-4442-45e2-bae0-8cda5732e659-kube-api-access-v7f86\") pod \"redhat-operators-rp7zl\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") " pod="openshift-marketplace/redhat-operators-rp7zl" Oct 06 13:38:19 crc kubenswrapper[4867]: I1006 13:38:19.208741 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rp7zl" Oct 06 13:38:19 crc kubenswrapper[4867]: I1006 13:38:19.728806 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rp7zl"] Oct 06 13:38:20 crc kubenswrapper[4867]: I1006 13:38:20.238715 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:20 crc kubenswrapper[4867]: I1006 13:38:20.239308 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:20 crc kubenswrapper[4867]: I1006 13:38:20.288589 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:20 crc kubenswrapper[4867]: I1006 13:38:20.315615 4867 generic.go:334] "Generic (PLEG): container finished" podID="7d67f9ae-4442-45e2-bae0-8cda5732e659" containerID="1f40ca6ec5b75bc2b2b2ca4fd354c4bd0a8a2013c19b845309a72da82fd8e6d2" exitCode=0 Oct 06 13:38:20 crc kubenswrapper[4867]: I1006 13:38:20.315683 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rp7zl" event={"ID":"7d67f9ae-4442-45e2-bae0-8cda5732e659","Type":"ContainerDied","Data":"1f40ca6ec5b75bc2b2b2ca4fd354c4bd0a8a2013c19b845309a72da82fd8e6d2"} Oct 06 13:38:20 crc kubenswrapper[4867]: I1006 13:38:20.315757 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rp7zl" event={"ID":"7d67f9ae-4442-45e2-bae0-8cda5732e659","Type":"ContainerStarted","Data":"159c48a1bd2d4c3c07baeaba3056fbd1c2334efab81129f2d39699caaa8c8222"} Oct 06 13:38:20 crc kubenswrapper[4867]: I1006 13:38:20.379435 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:22 crc kubenswrapper[4867]: I1006 13:38:22.341865 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rp7zl" event={"ID":"7d67f9ae-4442-45e2-bae0-8cda5732e659","Type":"ContainerStarted","Data":"28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85"} Oct 06 13:38:23 crc kubenswrapper[4867]: I1006 13:38:23.351894 4867 generic.go:334] "Generic (PLEG): container finished" podID="7d67f9ae-4442-45e2-bae0-8cda5732e659" containerID="28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85" exitCode=0 Oct 06 13:38:23 crc kubenswrapper[4867]: I1006 13:38:23.351936 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rp7zl" event={"ID":"7d67f9ae-4442-45e2-bae0-8cda5732e659","Type":"ContainerDied","Data":"28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85"} Oct 06 13:38:23 crc kubenswrapper[4867]: I1006 13:38:23.474762 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w977z"] Oct 06 13:38:23 crc kubenswrapper[4867]: I1006 13:38:23.475434 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w977z" podUID="0744054a-4883-42d8-abef-dcbf202d1030" containerName="registry-server" containerID="cri-o://0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20" gracePeriod=2 Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.084180 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w977z" Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.156088 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-utilities\") pod \"0744054a-4883-42d8-abef-dcbf202d1030\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.156407 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-catalog-content\") pod \"0744054a-4883-42d8-abef-dcbf202d1030\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.156450 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7tnr\" (UniqueName: \"kubernetes.io/projected/0744054a-4883-42d8-abef-dcbf202d1030-kube-api-access-c7tnr\") pod \"0744054a-4883-42d8-abef-dcbf202d1030\" (UID: \"0744054a-4883-42d8-abef-dcbf202d1030\") " Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.157277 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-utilities" (OuterVolumeSpecName: "utilities") pod "0744054a-4883-42d8-abef-dcbf202d1030" (UID: "0744054a-4883-42d8-abef-dcbf202d1030"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.164663 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0744054a-4883-42d8-abef-dcbf202d1030-kube-api-access-c7tnr" (OuterVolumeSpecName: "kube-api-access-c7tnr") pod "0744054a-4883-42d8-abef-dcbf202d1030" (UID: "0744054a-4883-42d8-abef-dcbf202d1030"). InnerVolumeSpecName "kube-api-access-c7tnr". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.203024 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0744054a-4883-42d8-abef-dcbf202d1030" (UID: "0744054a-4883-42d8-abef-dcbf202d1030"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.258569 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.258626 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7tnr\" (UniqueName: \"kubernetes.io/projected/0744054a-4883-42d8-abef-dcbf202d1030-kube-api-access-c7tnr\") on node \"crc\" DevicePath \"\""
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.258643 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0744054a-4883-42d8-abef-dcbf202d1030-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.363897 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rp7zl" event={"ID":"7d67f9ae-4442-45e2-bae0-8cda5732e659","Type":"ContainerStarted","Data":"cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566"}
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.367464 4867 generic.go:334] "Generic (PLEG): container finished" podID="0744054a-4883-42d8-abef-dcbf202d1030" containerID="0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20" exitCode=0
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.367548 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w977z"
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.367519 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w977z" event={"ID":"0744054a-4883-42d8-abef-dcbf202d1030","Type":"ContainerDied","Data":"0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20"}
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.367726 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w977z" event={"ID":"0744054a-4883-42d8-abef-dcbf202d1030","Type":"ContainerDied","Data":"74529481418a64568e0db8ffcdd724b121512b5872b779b720da07506c450afc"}
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.367765 4867 scope.go:117] "RemoveContainer" containerID="0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20"
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.390264 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rp7zl" podStartSLOduration=2.922176945 podStartE2EDuration="6.390226733s" podCreationTimestamp="2025-10-06 13:38:18 +0000 UTC" firstStartedPulling="2025-10-06 13:38:20.317843501 +0000 UTC m=+2079.775791635" lastFinishedPulling="2025-10-06 13:38:23.785893279 +0000 UTC m=+2083.243841423" observedRunningTime="2025-10-06 13:38:24.381398582 +0000 UTC m=+2083.839346726" watchObservedRunningTime="2025-10-06 13:38:24.390226733 +0000 UTC m=+2083.848174877"
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.392145 4867 scope.go:117] "RemoveContainer" containerID="62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6"
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.404241 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w977z"]
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.412703 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w977z"]
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.424014 4867 scope.go:117] "RemoveContainer" containerID="b18d1f9c6a7ea10f1244220e27b181450b1d69d7c37b7d9ba69e6fdbd02df826"
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.476882 4867 scope.go:117] "RemoveContainer" containerID="0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20"
Oct 06 13:38:24 crc kubenswrapper[4867]: E1006 13:38:24.478554 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20\": container with ID starting with 0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20 not found: ID does not exist" containerID="0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20"
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.478597 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20"} err="failed to get container status \"0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20\": rpc error: code = NotFound desc = could not find container \"0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20\": container with ID starting with 0425fc595d1cceae8f43597c9f319c3a63d868116e55c3e12413c0fb73a90e20 not found: ID does not exist"
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.478625 4867 scope.go:117] "RemoveContainer" containerID="62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6"
Oct 06 13:38:24 crc kubenswrapper[4867]: E1006 13:38:24.479105 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6\": container with ID starting with 62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6 not found: ID does not exist" containerID="62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6"
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.479170 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6"} err="failed to get container status \"62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6\": rpc error: code = NotFound desc = could not find container \"62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6\": container with ID starting with 62bc9fc44b95c7e3c67421c03fec7b4f67e9ddcf981e64f68e892d59495808c6 not found: ID does not exist"
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.479201 4867 scope.go:117] "RemoveContainer" containerID="b18d1f9c6a7ea10f1244220e27b181450b1d69d7c37b7d9ba69e6fdbd02df826"
Oct 06 13:38:24 crc kubenswrapper[4867]: E1006 13:38:24.479546 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18d1f9c6a7ea10f1244220e27b181450b1d69d7c37b7d9ba69e6fdbd02df826\": container with ID starting with b18d1f9c6a7ea10f1244220e27b181450b1d69d7c37b7d9ba69e6fdbd02df826 not found: ID does not exist" containerID="b18d1f9c6a7ea10f1244220e27b181450b1d69d7c37b7d9ba69e6fdbd02df826"
Oct 06 13:38:24 crc kubenswrapper[4867]: I1006 13:38:24.479582 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18d1f9c6a7ea10f1244220e27b181450b1d69d7c37b7d9ba69e6fdbd02df826"} err="failed to get container status \"b18d1f9c6a7ea10f1244220e27b181450b1d69d7c37b7d9ba69e6fdbd02df826\": rpc error: code = NotFound desc = could not find container \"b18d1f9c6a7ea10f1244220e27b181450b1d69d7c37b7d9ba69e6fdbd02df826\": container with ID starting with b18d1f9c6a7ea10f1244220e27b181450b1d69d7c37b7d9ba69e6fdbd02df826 not found: ID does not exist"
Oct 06 13:38:25 crc kubenswrapper[4867]: I1006 13:38:25.235371 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0744054a-4883-42d8-abef-dcbf202d1030" path="/var/lib/kubelet/pods/0744054a-4883-42d8-abef-dcbf202d1030/volumes"
Oct 06 13:38:29 crc kubenswrapper[4867]: I1006 13:38:29.208940 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rp7zl"
Oct 06 13:38:29 crc kubenswrapper[4867]: I1006 13:38:29.209555 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rp7zl"
Oct 06 13:38:29 crc kubenswrapper[4867]: I1006 13:38:29.274438 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rp7zl"
Oct 06 13:38:29 crc kubenswrapper[4867]: I1006 13:38:29.485179 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rp7zl"
Oct 06 13:38:31 crc kubenswrapper[4867]: I1006 13:38:31.260487 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rp7zl"]
Oct 06 13:38:31 crc kubenswrapper[4867]: I1006 13:38:31.436783 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rp7zl" podUID="7d67f9ae-4442-45e2-bae0-8cda5732e659" containerName="registry-server" containerID="cri-o://cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566" gracePeriod=2
Oct 06 13:38:31 crc kubenswrapper[4867]: I1006 13:38:31.867529 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rp7zl"
Oct 06 13:38:31 crc kubenswrapper[4867]: I1006 13:38:31.929970 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-catalog-content\") pod \"7d67f9ae-4442-45e2-bae0-8cda5732e659\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") "
Oct 06 13:38:31 crc kubenswrapper[4867]: I1006 13:38:31.930100 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7f86\" (UniqueName: \"kubernetes.io/projected/7d67f9ae-4442-45e2-bae0-8cda5732e659-kube-api-access-v7f86\") pod \"7d67f9ae-4442-45e2-bae0-8cda5732e659\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") "
Oct 06 13:38:31 crc kubenswrapper[4867]: I1006 13:38:31.930420 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-utilities\") pod \"7d67f9ae-4442-45e2-bae0-8cda5732e659\" (UID: \"7d67f9ae-4442-45e2-bae0-8cda5732e659\") "
Oct 06 13:38:31 crc kubenswrapper[4867]: I1006 13:38:31.931146 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-utilities" (OuterVolumeSpecName: "utilities") pod "7d67f9ae-4442-45e2-bae0-8cda5732e659" (UID: "7d67f9ae-4442-45e2-bae0-8cda5732e659"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:38:31 crc kubenswrapper[4867]: I1006 13:38:31.939281 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d67f9ae-4442-45e2-bae0-8cda5732e659-kube-api-access-v7f86" (OuterVolumeSpecName: "kube-api-access-v7f86") pod "7d67f9ae-4442-45e2-bae0-8cda5732e659" (UID: "7d67f9ae-4442-45e2-bae0-8cda5732e659"). InnerVolumeSpecName "kube-api-access-v7f86". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.021934 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d67f9ae-4442-45e2-bae0-8cda5732e659" (UID: "7d67f9ae-4442-45e2-bae0-8cda5732e659"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.034190 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.034221 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d67f9ae-4442-45e2-bae0-8cda5732e659-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.034235 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7f86\" (UniqueName: \"kubernetes.io/projected/7d67f9ae-4442-45e2-bae0-8cda5732e659-kube-api-access-v7f86\") on node \"crc\" DevicePath \"\""
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.449244 4867 generic.go:334] "Generic (PLEG): container finished" podID="7d67f9ae-4442-45e2-bae0-8cda5732e659" containerID="cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566" exitCode=0
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.449433 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rp7zl" event={"ID":"7d67f9ae-4442-45e2-bae0-8cda5732e659","Type":"ContainerDied","Data":"cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566"}
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.449625 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rp7zl"
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.449643 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rp7zl" event={"ID":"7d67f9ae-4442-45e2-bae0-8cda5732e659","Type":"ContainerDied","Data":"159c48a1bd2d4c3c07baeaba3056fbd1c2334efab81129f2d39699caaa8c8222"}
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.449675 4867 scope.go:117] "RemoveContainer" containerID="cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566"
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.473625 4867 scope.go:117] "RemoveContainer" containerID="28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85"
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.485936 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rp7zl"]
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.504127 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rp7zl"]
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.507545 4867 scope.go:117] "RemoveContainer" containerID="1f40ca6ec5b75bc2b2b2ca4fd354c4bd0a8a2013c19b845309a72da82fd8e6d2"
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.562288 4867 scope.go:117] "RemoveContainer" containerID="cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566"
Oct 06 13:38:32 crc kubenswrapper[4867]: E1006 13:38:32.563550 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566\": container with ID starting with cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566 not found: ID does not exist" containerID="cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566"
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.563604 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566"} err="failed to get container status \"cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566\": rpc error: code = NotFound desc = could not find container \"cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566\": container with ID starting with cf504ed93076e51c8a999170b7622e505fdae65b4bc3ee63166dee1a57c74566 not found: ID does not exist"
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.563625 4867 scope.go:117] "RemoveContainer" containerID="28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85"
Oct 06 13:38:32 crc kubenswrapper[4867]: E1006 13:38:32.564069 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85\": container with ID starting with 28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85 not found: ID does not exist" containerID="28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85"
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.564100 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85"} err="failed to get container status \"28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85\": rpc error: code = NotFound desc = could not find container \"28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85\": container with ID starting with 28c22c60eacdbab758c5c8d0a257dcea09149b853cc33da5fc23cdd1ebb9ec85 not found: ID does not exist"
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.564118 4867 scope.go:117] "RemoveContainer" containerID="1f40ca6ec5b75bc2b2b2ca4fd354c4bd0a8a2013c19b845309a72da82fd8e6d2"
Oct 06 13:38:32 crc kubenswrapper[4867]: E1006 13:38:32.566797 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f40ca6ec5b75bc2b2b2ca4fd354c4bd0a8a2013c19b845309a72da82fd8e6d2\": container with ID starting with 1f40ca6ec5b75bc2b2b2ca4fd354c4bd0a8a2013c19b845309a72da82fd8e6d2 not found: ID does not exist" containerID="1f40ca6ec5b75bc2b2b2ca4fd354c4bd0a8a2013c19b845309a72da82fd8e6d2"
Oct 06 13:38:32 crc kubenswrapper[4867]: I1006 13:38:32.566832 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f40ca6ec5b75bc2b2b2ca4fd354c4bd0a8a2013c19b845309a72da82fd8e6d2"} err="failed to get container status \"1f40ca6ec5b75bc2b2b2ca4fd354c4bd0a8a2013c19b845309a72da82fd8e6d2\": rpc error: code = NotFound desc = could not find container \"1f40ca6ec5b75bc2b2b2ca4fd354c4bd0a8a2013c19b845309a72da82fd8e6d2\": container with ID starting with 1f40ca6ec5b75bc2b2b2ca4fd354c4bd0a8a2013c19b845309a72da82fd8e6d2 not found: ID does not exist"
Oct 06 13:38:33 crc kubenswrapper[4867]: I1006 13:38:33.234105 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d67f9ae-4442-45e2-bae0-8cda5732e659" path="/var/lib/kubelet/pods/7d67f9ae-4442-45e2-bae0-8cda5732e659/volumes"
Oct 06 13:39:12 crc kubenswrapper[4867]: I1006 13:39:12.873428 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 13:39:12 crc kubenswrapper[4867]: I1006 13:39:12.874190 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 13:39:12 crc kubenswrapper[4867]: I1006 13:39:12.877694 4867 generic.go:334] "Generic (PLEG): container finished" podID="4960b423-de56-4b83-a577-f551c82c2702" containerID="54c61138d9fe4302ea7091249c096e2a3a928bc5b1aedd9a5896f37410cd956c" exitCode=0
Oct 06 13:39:12 crc kubenswrapper[4867]: I1006 13:39:12.877750 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" event={"ID":"4960b423-de56-4b83-a577-f551c82c2702","Type":"ContainerDied","Data":"54c61138d9fe4302ea7091249c096e2a3a928bc5b1aedd9a5896f37410cd956c"}
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.351388 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx"
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.478479 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5bvt\" (UniqueName: \"kubernetes.io/projected/4960b423-de56-4b83-a577-f551c82c2702-kube-api-access-v5bvt\") pod \"4960b423-de56-4b83-a577-f551c82c2702\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") "
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.479183 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-inventory\") pod \"4960b423-de56-4b83-a577-f551c82c2702\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") "
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.479438 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4960b423-de56-4b83-a577-f551c82c2702-ovncontroller-config-0\") pod \"4960b423-de56-4b83-a577-f551c82c2702\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") "
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.479548 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ovn-combined-ca-bundle\") pod \"4960b423-de56-4b83-a577-f551c82c2702\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") "
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.479741 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ssh-key\") pod \"4960b423-de56-4b83-a577-f551c82c2702\" (UID: \"4960b423-de56-4b83-a577-f551c82c2702\") "
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.486119 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4960b423-de56-4b83-a577-f551c82c2702-kube-api-access-v5bvt" (OuterVolumeSpecName: "kube-api-access-v5bvt") pod "4960b423-de56-4b83-a577-f551c82c2702" (UID: "4960b423-de56-4b83-a577-f551c82c2702"). InnerVolumeSpecName "kube-api-access-v5bvt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.491709 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4960b423-de56-4b83-a577-f551c82c2702" (UID: "4960b423-de56-4b83-a577-f551c82c2702"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.507332 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4960b423-de56-4b83-a577-f551c82c2702-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "4960b423-de56-4b83-a577-f551c82c2702" (UID: "4960b423-de56-4b83-a577-f551c82c2702"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.509717 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-inventory" (OuterVolumeSpecName: "inventory") pod "4960b423-de56-4b83-a577-f551c82c2702" (UID: "4960b423-de56-4b83-a577-f551c82c2702"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.515081 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4960b423-de56-4b83-a577-f551c82c2702" (UID: "4960b423-de56-4b83-a577-f551c82c2702"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.585732 4867 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4960b423-de56-4b83-a577-f551c82c2702-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.585788 4867 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.585800 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.585809 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5bvt\" (UniqueName: \"kubernetes.io/projected/4960b423-de56-4b83-a577-f551c82c2702-kube-api-access-v5bvt\") on node \"crc\" DevicePath \"\""
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.585819 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4960b423-de56-4b83-a577-f551c82c2702-inventory\") on node \"crc\" DevicePath \"\""
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.898440 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx" event={"ID":"4960b423-de56-4b83-a577-f551c82c2702","Type":"ContainerDied","Data":"2e1ac18a2d354a7ece019601ccb22d1eacc852110bc8b46967abfa996e1f96c5"}
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.898510 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e1ac18a2d354a7ece019601ccb22d1eacc852110bc8b46967abfa996e1f96c5"
Oct 06 13:39:14 crc kubenswrapper[4867]: I1006 13:39:14.898551 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7vftx"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.005817 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"]
Oct 06 13:39:15 crc kubenswrapper[4867]: E1006 13:39:15.006293 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4960b423-de56-4b83-a577-f551c82c2702" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.006310 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="4960b423-de56-4b83-a577-f551c82c2702" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 06 13:39:15 crc kubenswrapper[4867]: E1006 13:39:15.006318 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d67f9ae-4442-45e2-bae0-8cda5732e659" containerName="registry-server"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.006325 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d67f9ae-4442-45e2-bae0-8cda5732e659" containerName="registry-server"
Oct 06 13:39:15 crc kubenswrapper[4867]: E1006 13:39:15.006350 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d67f9ae-4442-45e2-bae0-8cda5732e659" containerName="extract-utilities"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.006358 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d67f9ae-4442-45e2-bae0-8cda5732e659" containerName="extract-utilities"
Oct 06 13:39:15 crc kubenswrapper[4867]: E1006 13:39:15.006371 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0744054a-4883-42d8-abef-dcbf202d1030" containerName="extract-utilities"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.006377 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0744054a-4883-42d8-abef-dcbf202d1030" containerName="extract-utilities"
Oct 06 13:39:15 crc kubenswrapper[4867]: E1006 13:39:15.006399 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0744054a-4883-42d8-abef-dcbf202d1030" containerName="extract-content"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.006405 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0744054a-4883-42d8-abef-dcbf202d1030" containerName="extract-content"
Oct 06 13:39:15 crc kubenswrapper[4867]: E1006 13:39:15.006420 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0744054a-4883-42d8-abef-dcbf202d1030" containerName="registry-server"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.006426 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0744054a-4883-42d8-abef-dcbf202d1030" containerName="registry-server"
Oct 06 13:39:15 crc kubenswrapper[4867]: E1006 13:39:15.006439 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d67f9ae-4442-45e2-bae0-8cda5732e659" containerName="extract-content"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.006444 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d67f9ae-4442-45e2-bae0-8cda5732e659" containerName="extract-content"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.006643 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d67f9ae-4442-45e2-bae0-8cda5732e659" containerName="registry-server"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.006659 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="4960b423-de56-4b83-a577-f551c82c2702" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.006674 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="0744054a-4883-42d8-abef-dcbf202d1030" containerName="registry-server"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.007640 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.010113 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.010317 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.010387 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.010447 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.010510 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.013134 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.019206 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"]
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.095352 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.095423 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv9q7\" (UniqueName: \"kubernetes.io/projected/081b2d1c-3691-40fb-8fde-05e44428087d-kube-api-access-nv9q7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.095457 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.095873 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.096527 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.096647 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.199308 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv9q7\" (UniqueName: \"kubernetes.io/projected/081b2d1c-3691-40fb-8fde-05e44428087d-kube-api-access-nv9q7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.199385 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.199424 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.199562 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.199602 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.199676 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.203135 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.204883 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.206902 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.215499 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"
Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.221136 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\"
(UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2" Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.223619 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv9q7\" (UniqueName: \"kubernetes.io/projected/081b2d1c-3691-40fb-8fde-05e44428087d-kube-api-access-nv9q7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2" Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.342870 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2" Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.879307 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2"] Oct 06 13:39:15 crc kubenswrapper[4867]: I1006 13:39:15.933035 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2" event={"ID":"081b2d1c-3691-40fb-8fde-05e44428087d","Type":"ContainerStarted","Data":"a31a482539cb2981d866ea3b974a6f1d0df87e5979a80990575d2f4f0ea3f055"} Oct 06 13:39:16 crc kubenswrapper[4867]: I1006 13:39:16.944161 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2" event={"ID":"081b2d1c-3691-40fb-8fde-05e44428087d","Type":"ContainerStarted","Data":"cf74f97ff165aa9de4981e57096d9adea3760c97d281a1cfc5874ab2679de11e"} Oct 06 13:39:16 crc kubenswrapper[4867]: I1006 13:39:16.963962 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2" podStartSLOduration=2.429453072 podStartE2EDuration="2.963943029s" podCreationTimestamp="2025-10-06 13:39:14 +0000 UTC" firstStartedPulling="2025-10-06 13:39:15.904753112 +0000 UTC m=+2135.362701256" lastFinishedPulling="2025-10-06 13:39:16.439243069 +0000 UTC m=+2135.897191213" observedRunningTime="2025-10-06 13:39:16.958739567 +0000 UTC m=+2136.416687711" watchObservedRunningTime="2025-10-06 13:39:16.963943029 +0000 UTC m=+2136.421891173" Oct 06 13:39:42 crc kubenswrapper[4867]: I1006 13:39:42.873275 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:39:42 crc kubenswrapper[4867]: I1006 13:39:42.873981 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:40:10 crc kubenswrapper[4867]: I1006 13:40:10.523637 4867 generic.go:334] "Generic (PLEG): container finished" podID="081b2d1c-3691-40fb-8fde-05e44428087d" containerID="cf74f97ff165aa9de4981e57096d9adea3760c97d281a1cfc5874ab2679de11e" exitCode=0 Oct 06 13:40:10 crc kubenswrapper[4867]: I1006 13:40:10.523806 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2" event={"ID":"081b2d1c-3691-40fb-8fde-05e44428087d","Type":"ContainerDied","Data":"cf74f97ff165aa9de4981e57096d9adea3760c97d281a1cfc5874ab2679de11e"} Oct 06 13:40:11 crc kubenswrapper[4867]: I1006 13:40:11.966942 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.130829 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-nova-metadata-neutron-config-0\") pod \"081b2d1c-3691-40fb-8fde-05e44428087d\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.130895 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-metadata-combined-ca-bundle\") pod \"081b2d1c-3691-40fb-8fde-05e44428087d\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.131069 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"081b2d1c-3691-40fb-8fde-05e44428087d\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.131164 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-ssh-key\") pod \"081b2d1c-3691-40fb-8fde-05e44428087d\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.131214 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv9q7\" (UniqueName: \"kubernetes.io/projected/081b2d1c-3691-40fb-8fde-05e44428087d-kube-api-access-nv9q7\") pod \"081b2d1c-3691-40fb-8fde-05e44428087d\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " Oct 06 
13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.131352 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-inventory\") pod \"081b2d1c-3691-40fb-8fde-05e44428087d\" (UID: \"081b2d1c-3691-40fb-8fde-05e44428087d\") " Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.137153 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "081b2d1c-3691-40fb-8fde-05e44428087d" (UID: "081b2d1c-3691-40fb-8fde-05e44428087d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.137527 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/081b2d1c-3691-40fb-8fde-05e44428087d-kube-api-access-nv9q7" (OuterVolumeSpecName: "kube-api-access-nv9q7") pod "081b2d1c-3691-40fb-8fde-05e44428087d" (UID: "081b2d1c-3691-40fb-8fde-05e44428087d"). InnerVolumeSpecName "kube-api-access-nv9q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.165396 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-inventory" (OuterVolumeSpecName: "inventory") pod "081b2d1c-3691-40fb-8fde-05e44428087d" (UID: "081b2d1c-3691-40fb-8fde-05e44428087d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.165887 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "081b2d1c-3691-40fb-8fde-05e44428087d" (UID: "081b2d1c-3691-40fb-8fde-05e44428087d"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.171099 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "081b2d1c-3691-40fb-8fde-05e44428087d" (UID: "081b2d1c-3691-40fb-8fde-05e44428087d"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.173527 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "081b2d1c-3691-40fb-8fde-05e44428087d" (UID: "081b2d1c-3691-40fb-8fde-05e44428087d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.234219 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.234272 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv9q7\" (UniqueName: \"kubernetes.io/projected/081b2d1c-3691-40fb-8fde-05e44428087d-kube-api-access-nv9q7\") on node \"crc\" DevicePath \"\"" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.234288 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.234302 4867 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.234315 4867 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.234329 4867 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/081b2d1c-3691-40fb-8fde-05e44428087d-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.548990 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2" 
event={"ID":"081b2d1c-3691-40fb-8fde-05e44428087d","Type":"ContainerDied","Data":"a31a482539cb2981d866ea3b974a6f1d0df87e5979a80990575d2f4f0ea3f055"} Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.549045 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a31a482539cb2981d866ea3b974a6f1d0df87e5979a80990575d2f4f0ea3f055" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.549064 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.678648 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9"] Oct 06 13:40:12 crc kubenswrapper[4867]: E1006 13:40:12.679177 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="081b2d1c-3691-40fb-8fde-05e44428087d" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.679197 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="081b2d1c-3691-40fb-8fde-05e44428087d" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.679397 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="081b2d1c-3691-40fb-8fde-05e44428087d" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.680162 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.682923 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.683274 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.683536 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.683735 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.684264 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.689237 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9"] Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.844855 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdkd9\" (UniqueName: \"kubernetes.io/projected/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-kube-api-access-wdkd9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.845215 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: 
\"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.845313 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.845370 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.845469 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.874085 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.874171 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" 
podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.874228 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.875189 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0b72170aaf35c6098ea3145446b3d986f52f72aeca02885baaa1bfa32d9e577"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.875389 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://a0b72170aaf35c6098ea3145446b3d986f52f72aeca02885baaa1bfa32d9e577" gracePeriod=600 Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.948322 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.948441 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: 
\"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.948584 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.948755 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdkd9\" (UniqueName: \"kubernetes.io/projected/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-kube-api-access-wdkd9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.948845 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.953791 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.954207 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.954632 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.955170 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:12 crc kubenswrapper[4867]: I1006 13:40:12.967068 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdkd9\" (UniqueName: \"kubernetes.io/projected/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-kube-api-access-wdkd9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:13 crc kubenswrapper[4867]: I1006 13:40:13.029338 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:40:13 crc kubenswrapper[4867]: I1006 13:40:13.564152 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="a0b72170aaf35c6098ea3145446b3d986f52f72aeca02885baaa1bfa32d9e577" exitCode=0 Oct 06 13:40:13 crc kubenswrapper[4867]: I1006 13:40:13.564530 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"a0b72170aaf35c6098ea3145446b3d986f52f72aeca02885baaa1bfa32d9e577"} Oct 06 13:40:13 crc kubenswrapper[4867]: I1006 13:40:13.564606 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d"} Oct 06 13:40:13 crc kubenswrapper[4867]: I1006 13:40:13.566113 4867 scope.go:117] "RemoveContainer" containerID="ef53c2d27e23b30eefb14e6e0668170fe02dd5276da3d7f4f81668ca5008085d" Oct 06 13:40:13 crc kubenswrapper[4867]: I1006 13:40:13.622711 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9"] Oct 06 13:40:13 crc kubenswrapper[4867]: W1006 13:40:13.632785 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod019fe1d6_0cfb_4b47_93bc_d08b1bd0f4be.slice/crio-4f19a6137c4ff498bc0c223d6529f4e2a2242d3b6335d8b636386844a415b22b WatchSource:0}: Error finding container 4f19a6137c4ff498bc0c223d6529f4e2a2242d3b6335d8b636386844a415b22b: Status 404 returned error can't find the container with id 4f19a6137c4ff498bc0c223d6529f4e2a2242d3b6335d8b636386844a415b22b Oct 06 13:40:14 crc kubenswrapper[4867]: I1006 13:40:14.576614 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" event={"ID":"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be","Type":"ContainerStarted","Data":"1e663b010530286cfa2c5e362b1eb882a725f691964566403348665b25d85480"} Oct 06 13:40:14 crc kubenswrapper[4867]: I1006 13:40:14.576900 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" event={"ID":"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be","Type":"ContainerStarted","Data":"4f19a6137c4ff498bc0c223d6529f4e2a2242d3b6335d8b636386844a415b22b"} Oct 06 13:40:14 crc kubenswrapper[4867]: I1006 13:40:14.594881 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" podStartSLOduration=2.199255206 podStartE2EDuration="2.594854403s" podCreationTimestamp="2025-10-06 13:40:12 +0000 UTC" firstStartedPulling="2025-10-06 13:40:13.638586365 +0000 UTC m=+2193.096534509" lastFinishedPulling="2025-10-06 13:40:14.034185522 +0000 UTC m=+2193.492133706" observedRunningTime="2025-10-06 13:40:14.591371558 +0000 UTC m=+2194.049319722" watchObservedRunningTime="2025-10-06 13:40:14.594854403 +0000 UTC m=+2194.052802557" Oct 06 13:42:42 crc kubenswrapper[4867]: I1006 13:42:42.873661 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:42:42 crc kubenswrapper[4867]: I1006 13:42:42.875415 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Oct 06 13:43:12 crc kubenswrapper[4867]: I1006 13:43:12.873637 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:43:12 crc kubenswrapper[4867]: I1006 13:43:12.874312 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:43:42 crc kubenswrapper[4867]: I1006 13:43:42.873519 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:43:42 crc kubenswrapper[4867]: I1006 13:43:42.874116 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:43:42 crc kubenswrapper[4867]: I1006 13:43:42.874160 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:43:42 crc kubenswrapper[4867]: I1006 13:43:42.875059 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d"} 
pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:43:42 crc kubenswrapper[4867]: I1006 13:43:42.875120 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" gracePeriod=600 Oct 06 13:43:42 crc kubenswrapper[4867]: E1006 13:43:42.996235 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:43:43 crc kubenswrapper[4867]: I1006 13:43:43.787484 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" exitCode=0 Oct 06 13:43:43 crc kubenswrapper[4867]: I1006 13:43:43.787573 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d"} Oct 06 13:43:43 crc kubenswrapper[4867]: I1006 13:43:43.787863 4867 scope.go:117] "RemoveContainer" containerID="a0b72170aaf35c6098ea3145446b3d986f52f72aeca02885baaa1bfa32d9e577" Oct 06 13:43:43 crc kubenswrapper[4867]: I1006 13:43:43.788798 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 
06 13:43:43 crc kubenswrapper[4867]: E1006 13:43:43.789206 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.327706 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mjsr5"] Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.335341 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.348821 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjsr5"] Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.393494 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-catalog-content\") pod \"redhat-marketplace-mjsr5\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.393799 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psxnr\" (UniqueName: \"kubernetes.io/projected/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-kube-api-access-psxnr\") pod \"redhat-marketplace-mjsr5\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.394142 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-utilities\") pod \"redhat-marketplace-mjsr5\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.495629 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psxnr\" (UniqueName: \"kubernetes.io/projected/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-kube-api-access-psxnr\") pod \"redhat-marketplace-mjsr5\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.495754 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-utilities\") pod \"redhat-marketplace-mjsr5\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.495816 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-catalog-content\") pod \"redhat-marketplace-mjsr5\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.496392 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-catalog-content\") pod \"redhat-marketplace-mjsr5\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.496491 4867 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-utilities\") pod \"redhat-marketplace-mjsr5\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.518655 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psxnr\" (UniqueName: \"kubernetes.io/projected/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-kube-api-access-psxnr\") pod \"redhat-marketplace-mjsr5\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:43:54 crc kubenswrapper[4867]: I1006 13:43:54.661216 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:43:55 crc kubenswrapper[4867]: I1006 13:43:55.099872 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjsr5"] Oct 06 13:43:55 crc kubenswrapper[4867]: I1006 13:43:55.222190 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:43:55 crc kubenswrapper[4867]: E1006 13:43:55.222425 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:43:55 crc kubenswrapper[4867]: I1006 13:43:55.923425 4867 generic.go:334] "Generic (PLEG): container finished" podID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" containerID="60c00874ba7e56759100f2ed333549ffdd8bae6eb231eeaf331e250104c55467" exitCode=0 Oct 06 13:43:55 crc kubenswrapper[4867]: I1006 
13:43:55.923494 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjsr5" event={"ID":"0ea88b70-18e8-4e29-9ce3-b5e34d66342c","Type":"ContainerDied","Data":"60c00874ba7e56759100f2ed333549ffdd8bae6eb231eeaf331e250104c55467"} Oct 06 13:43:55 crc kubenswrapper[4867]: I1006 13:43:55.923794 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjsr5" event={"ID":"0ea88b70-18e8-4e29-9ce3-b5e34d66342c","Type":"ContainerStarted","Data":"a4d61957001b3735d67b53d3eceaf485fbf5c8794e0dbae51b8c58bf7baa5925"} Oct 06 13:43:55 crc kubenswrapper[4867]: I1006 13:43:55.935835 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:43:56 crc kubenswrapper[4867]: I1006 13:43:56.936437 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjsr5" event={"ID":"0ea88b70-18e8-4e29-9ce3-b5e34d66342c","Type":"ContainerStarted","Data":"dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a"} Oct 06 13:43:57 crc kubenswrapper[4867]: I1006 13:43:57.947878 4867 generic.go:334] "Generic (PLEG): container finished" podID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" containerID="dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a" exitCode=0 Oct 06 13:43:57 crc kubenswrapper[4867]: I1006 13:43:57.947935 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjsr5" event={"ID":"0ea88b70-18e8-4e29-9ce3-b5e34d66342c","Type":"ContainerDied","Data":"dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a"} Oct 06 13:43:58 crc kubenswrapper[4867]: I1006 13:43:58.962933 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjsr5" event={"ID":"0ea88b70-18e8-4e29-9ce3-b5e34d66342c","Type":"ContainerStarted","Data":"61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c"} Oct 06 13:44:04 
crc kubenswrapper[4867]: I1006 13:44:04.661584 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:44:04 crc kubenswrapper[4867]: I1006 13:44:04.662242 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:44:04 crc kubenswrapper[4867]: I1006 13:44:04.709305 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:44:04 crc kubenswrapper[4867]: I1006 13:44:04.739506 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mjsr5" podStartSLOduration=8.329902342 podStartE2EDuration="10.739481211s" podCreationTimestamp="2025-10-06 13:43:54 +0000 UTC" firstStartedPulling="2025-10-06 13:43:55.935607596 +0000 UTC m=+2415.393555740" lastFinishedPulling="2025-10-06 13:43:58.345186465 +0000 UTC m=+2417.803134609" observedRunningTime="2025-10-06 13:43:58.984300289 +0000 UTC m=+2418.442248443" watchObservedRunningTime="2025-10-06 13:44:04.739481211 +0000 UTC m=+2424.197429365" Oct 06 13:44:05 crc kubenswrapper[4867]: I1006 13:44:05.095289 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:44:05 crc kubenswrapper[4867]: I1006 13:44:05.157683 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjsr5"] Oct 06 13:44:07 crc kubenswrapper[4867]: I1006 13:44:07.069650 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mjsr5" podUID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" containerName="registry-server" containerID="cri-o://61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c" gracePeriod=2 Oct 06 13:44:07 crc kubenswrapper[4867]: I1006 13:44:07.527454 4867 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:44:07 crc kubenswrapper[4867]: I1006 13:44:07.579895 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-utilities\") pod \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " Oct 06 13:44:07 crc kubenswrapper[4867]: I1006 13:44:07.580225 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-catalog-content\") pod \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " Oct 06 13:44:07 crc kubenswrapper[4867]: I1006 13:44:07.580294 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psxnr\" (UniqueName: \"kubernetes.io/projected/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-kube-api-access-psxnr\") pod \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\" (UID: \"0ea88b70-18e8-4e29-9ce3-b5e34d66342c\") " Oct 06 13:44:07 crc kubenswrapper[4867]: I1006 13:44:07.581035 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-utilities" (OuterVolumeSpecName: "utilities") pod "0ea88b70-18e8-4e29-9ce3-b5e34d66342c" (UID: "0ea88b70-18e8-4e29-9ce3-b5e34d66342c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:44:07 crc kubenswrapper[4867]: I1006 13:44:07.587895 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-kube-api-access-psxnr" (OuterVolumeSpecName: "kube-api-access-psxnr") pod "0ea88b70-18e8-4e29-9ce3-b5e34d66342c" (UID: "0ea88b70-18e8-4e29-9ce3-b5e34d66342c"). 
InnerVolumeSpecName "kube-api-access-psxnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:44:07 crc kubenswrapper[4867]: I1006 13:44:07.593764 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ea88b70-18e8-4e29-9ce3-b5e34d66342c" (UID: "0ea88b70-18e8-4e29-9ce3-b5e34d66342c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:44:07 crc kubenswrapper[4867]: I1006 13:44:07.683471 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:44:07 crc kubenswrapper[4867]: I1006 13:44:07.683837 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psxnr\" (UniqueName: \"kubernetes.io/projected/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-kube-api-access-psxnr\") on node \"crc\" DevicePath \"\"" Oct 06 13:44:07 crc kubenswrapper[4867]: I1006 13:44:07.683851 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea88b70-18e8-4e29-9ce3-b5e34d66342c-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.088123 4867 generic.go:334] "Generic (PLEG): container finished" podID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" containerID="61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c" exitCode=0 Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.088171 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjsr5" event={"ID":"0ea88b70-18e8-4e29-9ce3-b5e34d66342c","Type":"ContainerDied","Data":"61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c"} Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.088209 4867 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mjsr5" event={"ID":"0ea88b70-18e8-4e29-9ce3-b5e34d66342c","Type":"ContainerDied","Data":"a4d61957001b3735d67b53d3eceaf485fbf5c8794e0dbae51b8c58bf7baa5925"} Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.088229 4867 scope.go:117] "RemoveContainer" containerID="61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c" Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.088331 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mjsr5" Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.150322 4867 scope.go:117] "RemoveContainer" containerID="dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a" Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.157840 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjsr5"] Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.171286 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mjsr5"] Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.191543 4867 scope.go:117] "RemoveContainer" containerID="60c00874ba7e56759100f2ed333549ffdd8bae6eb231eeaf331e250104c55467" Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.246309 4867 scope.go:117] "RemoveContainer" containerID="61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c" Oct 06 13:44:08 crc kubenswrapper[4867]: E1006 13:44:08.247760 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c\": container with ID starting with 61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c not found: ID does not exist" containerID="61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c" Oct 06 13:44:08 crc 
kubenswrapper[4867]: I1006 13:44:08.247809 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c"} err="failed to get container status \"61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c\": rpc error: code = NotFound desc = could not find container \"61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c\": container with ID starting with 61cf567c9d7fca3c1b1ceb1588ab0acdc5d949844ebb740aa0cbe9bef90aa96c not found: ID does not exist" Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.247842 4867 scope.go:117] "RemoveContainer" containerID="dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a" Oct 06 13:44:08 crc kubenswrapper[4867]: E1006 13:44:08.248308 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a\": container with ID starting with dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a not found: ID does not exist" containerID="dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a" Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.248329 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a"} err="failed to get container status \"dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a\": rpc error: code = NotFound desc = could not find container \"dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a\": container with ID starting with dbd127032301ba4b596278fe619a536c40f70969f6bdf302dc0b8c7ae316679a not found: ID does not exist" Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.248345 4867 scope.go:117] "RemoveContainer" containerID="60c00874ba7e56759100f2ed333549ffdd8bae6eb231eeaf331e250104c55467" Oct 06 
13:44:08 crc kubenswrapper[4867]: E1006 13:44:08.248590 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60c00874ba7e56759100f2ed333549ffdd8bae6eb231eeaf331e250104c55467\": container with ID starting with 60c00874ba7e56759100f2ed333549ffdd8bae6eb231eeaf331e250104c55467 not found: ID does not exist" containerID="60c00874ba7e56759100f2ed333549ffdd8bae6eb231eeaf331e250104c55467" Oct 06 13:44:08 crc kubenswrapper[4867]: I1006 13:44:08.248625 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60c00874ba7e56759100f2ed333549ffdd8bae6eb231eeaf331e250104c55467"} err="failed to get container status \"60c00874ba7e56759100f2ed333549ffdd8bae6eb231eeaf331e250104c55467\": rpc error: code = NotFound desc = could not find container \"60c00874ba7e56759100f2ed333549ffdd8bae6eb231eeaf331e250104c55467\": container with ID starting with 60c00874ba7e56759100f2ed333549ffdd8bae6eb231eeaf331e250104c55467 not found: ID does not exist" Oct 06 13:44:09 crc kubenswrapper[4867]: I1006 13:44:09.221139 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:44:09 crc kubenswrapper[4867]: E1006 13:44:09.221520 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:44:09 crc kubenswrapper[4867]: I1006 13:44:09.232783 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" path="/var/lib/kubelet/pods/0ea88b70-18e8-4e29-9ce3-b5e34d66342c/volumes" Oct 06 13:44:23 crc 
kubenswrapper[4867]: I1006 13:44:23.221225 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:44:23 crc kubenswrapper[4867]: E1006 13:44:23.222015 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:44:36 crc kubenswrapper[4867]: I1006 13:44:36.221380 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:44:36 crc kubenswrapper[4867]: E1006 13:44:36.222785 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:44:51 crc kubenswrapper[4867]: I1006 13:44:51.230214 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:44:51 crc kubenswrapper[4867]: E1006 13:44:51.231647 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 
06 13:44:55 crc kubenswrapper[4867]: I1006 13:44:55.603944 4867 generic.go:334] "Generic (PLEG): container finished" podID="019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be" containerID="1e663b010530286cfa2c5e362b1eb882a725f691964566403348665b25d85480" exitCode=0 Oct 06 13:44:55 crc kubenswrapper[4867]: I1006 13:44:55.603983 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" event={"ID":"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be","Type":"ContainerDied","Data":"1e663b010530286cfa2c5e362b1eb882a725f691964566403348665b25d85480"} Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.091037 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.152582 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-ssh-key\") pod \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.152630 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-secret-0\") pod \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.152685 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-inventory\") pod \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.152818 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-combined-ca-bundle\") pod \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.152877 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdkd9\" (UniqueName: \"kubernetes.io/projected/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-kube-api-access-wdkd9\") pod \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\" (UID: \"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be\") " Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.159246 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be" (UID: "019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.161626 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-kube-api-access-wdkd9" (OuterVolumeSpecName: "kube-api-access-wdkd9") pod "019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be" (UID: "019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be"). InnerVolumeSpecName "kube-api-access-wdkd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.186292 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be" (UID: "019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.186967 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be" (UID: "019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.192609 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-inventory" (OuterVolumeSpecName: "inventory") pod "019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be" (UID: "019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.256670 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.256718 4867 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.256733 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.256748 4867 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 
13:44:57.256763 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdkd9\" (UniqueName: \"kubernetes.io/projected/019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be-kube-api-access-wdkd9\") on node \"crc\" DevicePath \"\"" Oct 06 13:44:57 crc kubenswrapper[4867]: E1006 13:44:57.432815 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod019fe1d6_0cfb_4b47_93bc_d08b1bd0f4be.slice/crio-4f19a6137c4ff498bc0c223d6529f4e2a2242d3b6335d8b636386844a415b22b\": RecentStats: unable to find data in memory cache]" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.630470 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" event={"ID":"019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be","Type":"ContainerDied","Data":"4f19a6137c4ff498bc0c223d6529f4e2a2242d3b6335d8b636386844a415b22b"} Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.630532 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f19a6137c4ff498bc0c223d6529f4e2a2242d3b6335d8b636386844a415b22b" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.630652 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.762537 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw"] Oct 06 13:44:57 crc kubenswrapper[4867]: E1006 13:44:57.763200 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" containerName="extract-content" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.763233 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" containerName="extract-content" Oct 06 13:44:57 crc kubenswrapper[4867]: E1006 13:44:57.763243 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.763356 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 13:44:57 crc kubenswrapper[4867]: E1006 13:44:57.763398 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" containerName="extract-utilities" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.763409 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" containerName="extract-utilities" Oct 06 13:44:57 crc kubenswrapper[4867]: E1006 13:44:57.763423 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" containerName="registry-server" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.763431 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" containerName="registry-server" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.763663 4867 
memory_manager.go:354] "RemoveStaleState removing state" podUID="019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.763690 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea88b70-18e8-4e29-9ce3-b5e34d66342c" containerName="registry-server" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.764710 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.769205 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.769562 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.769737 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.770779 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.770823 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.771759 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw"] Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.772784 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.772842 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:44:57 crc 
kubenswrapper[4867]: I1006 13:44:57.779861 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.780045 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.780092 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.780207 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.780545 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.780753 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.780931 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5h44\" (UniqueName: \"kubernetes.io/projected/33622851-83c0-48c9-969d-99f96fbcb64f-kube-api-access-p5h44\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.781125 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/33622851-83c0-48c9-969d-99f96fbcb64f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.781403 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: 
\"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.882850 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.882923 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.882993 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.883018 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.883063 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" 
(UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.883145 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.883179 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.883202 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5h44\" (UniqueName: \"kubernetes.io/projected/33622851-83c0-48c9-969d-99f96fbcb64f-kube-api-access-p5h44\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.883240 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/33622851-83c0-48c9-969d-99f96fbcb64f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.884479 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/33622851-83c0-48c9-969d-99f96fbcb64f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.888881 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.889112 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.889150 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.889491 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.890189 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.890571 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.894306 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:57 crc kubenswrapper[4867]: I1006 13:44:57.904037 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5h44\" (UniqueName: \"kubernetes.io/projected/33622851-83c0-48c9-969d-99f96fbcb64f-kube-api-access-p5h44\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jgjsw\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:58 crc 
kubenswrapper[4867]: I1006 13:44:58.084551 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:44:58 crc kubenswrapper[4867]: I1006 13:44:58.645548 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw"] Oct 06 13:44:59 crc kubenswrapper[4867]: I1006 13:44:59.656440 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" event={"ID":"33622851-83c0-48c9-969d-99f96fbcb64f","Type":"ContainerStarted","Data":"3ce163a8041983f18765a04933a2c2e09bb18bd2182cacb3fe6cbc4ac82496d5"} Oct 06 13:44:59 crc kubenswrapper[4867]: I1006 13:44:59.657177 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" event={"ID":"33622851-83c0-48c9-969d-99f96fbcb64f","Type":"ContainerStarted","Data":"d7926cb967c7950471bf40bf289f878699f8f8746374f59ab781542067c1d853"} Oct 06 13:44:59 crc kubenswrapper[4867]: I1006 13:44:59.678347 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" podStartSLOduration=2.081544273 podStartE2EDuration="2.678314325s" podCreationTimestamp="2025-10-06 13:44:57 +0000 UTC" firstStartedPulling="2025-10-06 13:44:58.65153216 +0000 UTC m=+2478.109480304" lastFinishedPulling="2025-10-06 13:44:59.248302212 +0000 UTC m=+2478.706250356" observedRunningTime="2025-10-06 13:44:59.677160254 +0000 UTC m=+2479.135108398" watchObservedRunningTime="2025-10-06 13:44:59.678314325 +0000 UTC m=+2479.136262469" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.149226 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj"] Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.152735 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.162963 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj"] Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.192038 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.192507 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.248484 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgcnq\" (UniqueName: \"kubernetes.io/projected/2d873c69-5229-4308-ac4a-e6e83a067a42-kube-api-access-xgcnq\") pod \"collect-profiles-29329305-rrfmj\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.248559 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d873c69-5229-4308-ac4a-e6e83a067a42-config-volume\") pod \"collect-profiles-29329305-rrfmj\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.248634 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d873c69-5229-4308-ac4a-e6e83a067a42-secret-volume\") pod \"collect-profiles-29329305-rrfmj\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.350190 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d873c69-5229-4308-ac4a-e6e83a067a42-config-volume\") pod \"collect-profiles-29329305-rrfmj\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.350317 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d873c69-5229-4308-ac4a-e6e83a067a42-secret-volume\") pod \"collect-profiles-29329305-rrfmj\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.350526 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgcnq\" (UniqueName: \"kubernetes.io/projected/2d873c69-5229-4308-ac4a-e6e83a067a42-kube-api-access-xgcnq\") pod \"collect-profiles-29329305-rrfmj\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.351429 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d873c69-5229-4308-ac4a-e6e83a067a42-config-volume\") pod \"collect-profiles-29329305-rrfmj\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.370070 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2d873c69-5229-4308-ac4a-e6e83a067a42-secret-volume\") pod \"collect-profiles-29329305-rrfmj\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.373819 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgcnq\" (UniqueName: \"kubernetes.io/projected/2d873c69-5229-4308-ac4a-e6e83a067a42-kube-api-access-xgcnq\") pod \"collect-profiles-29329305-rrfmj\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:00 crc kubenswrapper[4867]: I1006 13:45:00.517149 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:01 crc kubenswrapper[4867]: I1006 13:45:01.041991 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj"] Oct 06 13:45:01 crc kubenswrapper[4867]: I1006 13:45:01.682589 4867 generic.go:334] "Generic (PLEG): container finished" podID="2d873c69-5229-4308-ac4a-e6e83a067a42" containerID="1e435bd000c504f249c257cde5c809373df810cbc73b285f2c70475395899636" exitCode=0 Oct 06 13:45:01 crc kubenswrapper[4867]: I1006 13:45:01.682651 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" event={"ID":"2d873c69-5229-4308-ac4a-e6e83a067a42","Type":"ContainerDied","Data":"1e435bd000c504f249c257cde5c809373df810cbc73b285f2c70475395899636"} Oct 06 13:45:01 crc kubenswrapper[4867]: I1006 13:45:01.683075 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" 
event={"ID":"2d873c69-5229-4308-ac4a-e6e83a067a42","Type":"ContainerStarted","Data":"d05a647615aadb00e5bbe73939d0a9bffd097cc8edf7fe45dce82959610ed76b"} Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.127677 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.320128 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d873c69-5229-4308-ac4a-e6e83a067a42-secret-volume\") pod \"2d873c69-5229-4308-ac4a-e6e83a067a42\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.320605 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d873c69-5229-4308-ac4a-e6e83a067a42-config-volume\") pod \"2d873c69-5229-4308-ac4a-e6e83a067a42\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.320657 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgcnq\" (UniqueName: \"kubernetes.io/projected/2d873c69-5229-4308-ac4a-e6e83a067a42-kube-api-access-xgcnq\") pod \"2d873c69-5229-4308-ac4a-e6e83a067a42\" (UID: \"2d873c69-5229-4308-ac4a-e6e83a067a42\") " Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.321943 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d873c69-5229-4308-ac4a-e6e83a067a42-config-volume" (OuterVolumeSpecName: "config-volume") pod "2d873c69-5229-4308-ac4a-e6e83a067a42" (UID: "2d873c69-5229-4308-ac4a-e6e83a067a42"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.329212 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d873c69-5229-4308-ac4a-e6e83a067a42-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2d873c69-5229-4308-ac4a-e6e83a067a42" (UID: "2d873c69-5229-4308-ac4a-e6e83a067a42"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.333279 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d873c69-5229-4308-ac4a-e6e83a067a42-kube-api-access-xgcnq" (OuterVolumeSpecName: "kube-api-access-xgcnq") pod "2d873c69-5229-4308-ac4a-e6e83a067a42" (UID: "2d873c69-5229-4308-ac4a-e6e83a067a42"). InnerVolumeSpecName "kube-api-access-xgcnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.423747 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2d873c69-5229-4308-ac4a-e6e83a067a42-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.424156 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2d873c69-5229-4308-ac4a-e6e83a067a42-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.424204 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgcnq\" (UniqueName: \"kubernetes.io/projected/2d873c69-5229-4308-ac4a-e6e83a067a42-kube-api-access-xgcnq\") on node \"crc\" DevicePath \"\"" Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.709193 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" 
event={"ID":"2d873c69-5229-4308-ac4a-e6e83a067a42","Type":"ContainerDied","Data":"d05a647615aadb00e5bbe73939d0a9bffd097cc8edf7fe45dce82959610ed76b"} Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.709281 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d05a647615aadb00e5bbe73939d0a9bffd097cc8edf7fe45dce82959610ed76b" Oct 06 13:45:03 crc kubenswrapper[4867]: I1006 13:45:03.709380 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj" Oct 06 13:45:04 crc kubenswrapper[4867]: I1006 13:45:04.222396 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:45:04 crc kubenswrapper[4867]: E1006 13:45:04.222941 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:45:04 crc kubenswrapper[4867]: I1006 13:45:04.227270 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc"] Oct 06 13:45:04 crc kubenswrapper[4867]: I1006 13:45:04.241734 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329260-qlqqc"] Oct 06 13:45:05 crc kubenswrapper[4867]: I1006 13:45:05.253224 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598298ab-238a-4776-9e23-e66c273dc805" path="/var/lib/kubelet/pods/598298ab-238a-4776-9e23-e66c273dc805/volumes" Oct 06 13:45:19 crc kubenswrapper[4867]: I1006 13:45:19.222064 4867 scope.go:117] "RemoveContainer" 
containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:45:19 crc kubenswrapper[4867]: E1006 13:45:19.223418 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:45:30 crc kubenswrapper[4867]: I1006 13:45:30.221610 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:45:30 crc kubenswrapper[4867]: E1006 13:45:30.223452 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:45:41 crc kubenswrapper[4867]: I1006 13:45:41.244255 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:45:41 crc kubenswrapper[4867]: E1006 13:45:41.245228 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:45:56 crc kubenswrapper[4867]: I1006 13:45:56.221098 4867 scope.go:117] 
"RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:45:56 crc kubenswrapper[4867]: E1006 13:45:56.221848 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:46:02 crc kubenswrapper[4867]: I1006 13:46:02.642123 4867 scope.go:117] "RemoveContainer" containerID="c72ea94e655d936e618651192f8799ebba998eb0cb6590bad3bc3db720987234" Oct 06 13:46:07 crc kubenswrapper[4867]: I1006 13:46:07.222437 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:46:07 crc kubenswrapper[4867]: E1006 13:46:07.224213 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:46:20 crc kubenswrapper[4867]: I1006 13:46:20.221884 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:46:20 crc kubenswrapper[4867]: E1006 13:46:20.222738 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:46:33 crc kubenswrapper[4867]: I1006 13:46:33.222578 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:46:33 crc kubenswrapper[4867]: E1006 13:46:33.224062 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:46:48 crc kubenswrapper[4867]: I1006 13:46:48.221367 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:46:48 crc kubenswrapper[4867]: E1006 13:46:48.222047 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:47:02 crc kubenswrapper[4867]: I1006 13:47:02.221424 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:47:02 crc kubenswrapper[4867]: E1006 13:47:02.222833 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:47:16 crc kubenswrapper[4867]: I1006 13:47:16.221144 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:47:16 crc kubenswrapper[4867]: E1006 13:47:16.221895 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:47:31 crc kubenswrapper[4867]: I1006 13:47:31.230364 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:47:31 crc kubenswrapper[4867]: E1006 13:47:31.231610 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:47:45 crc kubenswrapper[4867]: I1006 13:47:45.222022 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:47:45 crc kubenswrapper[4867]: E1006 13:47:45.223109 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:47:59 crc kubenswrapper[4867]: I1006 13:47:59.222453 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:47:59 crc kubenswrapper[4867]: E1006 13:47:59.223862 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:48:14 crc kubenswrapper[4867]: I1006 13:48:14.221828 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:48:14 crc kubenswrapper[4867]: E1006 13:48:14.236743 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:48:27 crc kubenswrapper[4867]: I1006 13:48:27.223853 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:48:27 crc kubenswrapper[4867]: E1006 13:48:27.224942 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:48:34 crc kubenswrapper[4867]: I1006 13:48:34.095118 4867 generic.go:334] "Generic (PLEG): container finished" podID="33622851-83c0-48c9-969d-99f96fbcb64f" containerID="3ce163a8041983f18765a04933a2c2e09bb18bd2182cacb3fe6cbc4ac82496d5" exitCode=0 Oct 06 13:48:34 crc kubenswrapper[4867]: I1006 13:48:34.095335 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" event={"ID":"33622851-83c0-48c9-969d-99f96fbcb64f","Type":"ContainerDied","Data":"3ce163a8041983f18765a04933a2c2e09bb18bd2182cacb3fe6cbc4ac82496d5"} Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.591306 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.634308 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-0\") pod \"33622851-83c0-48c9-969d-99f96fbcb64f\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.634420 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-1\") pod \"33622851-83c0-48c9-969d-99f96fbcb64f\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.634449 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/33622851-83c0-48c9-969d-99f96fbcb64f-nova-extra-config-0\") pod \"33622851-83c0-48c9-969d-99f96fbcb64f\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.634589 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-inventory\") pod \"33622851-83c0-48c9-969d-99f96fbcb64f\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.634682 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5h44\" (UniqueName: \"kubernetes.io/projected/33622851-83c0-48c9-969d-99f96fbcb64f-kube-api-access-p5h44\") pod \"33622851-83c0-48c9-969d-99f96fbcb64f\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.634720 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-ssh-key\") pod \"33622851-83c0-48c9-969d-99f96fbcb64f\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.634769 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-1\") pod \"33622851-83c0-48c9-969d-99f96fbcb64f\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.634861 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-combined-ca-bundle\") pod \"33622851-83c0-48c9-969d-99f96fbcb64f\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " Oct 06 13:48:35 crc 
kubenswrapper[4867]: I1006 13:48:35.634969 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-0\") pod \"33622851-83c0-48c9-969d-99f96fbcb64f\" (UID: \"33622851-83c0-48c9-969d-99f96fbcb64f\") " Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.660465 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "33622851-83c0-48c9-969d-99f96fbcb64f" (UID: "33622851-83c0-48c9-969d-99f96fbcb64f"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.660542 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33622851-83c0-48c9-969d-99f96fbcb64f-kube-api-access-p5h44" (OuterVolumeSpecName: "kube-api-access-p5h44") pod "33622851-83c0-48c9-969d-99f96fbcb64f" (UID: "33622851-83c0-48c9-969d-99f96fbcb64f"). InnerVolumeSpecName "kube-api-access-p5h44". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.675276 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "33622851-83c0-48c9-969d-99f96fbcb64f" (UID: "33622851-83c0-48c9-969d-99f96fbcb64f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.679508 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "33622851-83c0-48c9-969d-99f96fbcb64f" (UID: "33622851-83c0-48c9-969d-99f96fbcb64f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.679554 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "33622851-83c0-48c9-969d-99f96fbcb64f" (UID: "33622851-83c0-48c9-969d-99f96fbcb64f"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.682941 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "33622851-83c0-48c9-969d-99f96fbcb64f" (UID: "33622851-83c0-48c9-969d-99f96fbcb64f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.684512 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "33622851-83c0-48c9-969d-99f96fbcb64f" (UID: "33622851-83c0-48c9-969d-99f96fbcb64f"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.699000 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33622851-83c0-48c9-969d-99f96fbcb64f-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "33622851-83c0-48c9-969d-99f96fbcb64f" (UID: "33622851-83c0-48c9-969d-99f96fbcb64f"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.717190 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-inventory" (OuterVolumeSpecName: "inventory") pod "33622851-83c0-48c9-969d-99f96fbcb64f" (UID: "33622851-83c0-48c9-969d-99f96fbcb64f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.738523 4867 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.738564 4867 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.738575 4867 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.738587 4867 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.738596 4867 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/33622851-83c0-48c9-969d-99f96fbcb64f-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.738607 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.738616 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5h44\" (UniqueName: \"kubernetes.io/projected/33622851-83c0-48c9-969d-99f96fbcb64f-kube-api-access-p5h44\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.738625 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:35 crc kubenswrapper[4867]: I1006 13:48:35.738634 4867 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/33622851-83c0-48c9-969d-99f96fbcb64f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.116705 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" event={"ID":"33622851-83c0-48c9-969d-99f96fbcb64f","Type":"ContainerDied","Data":"d7926cb967c7950471bf40bf289f878699f8f8746374f59ab781542067c1d853"} Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.117019 4867 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d7926cb967c7950471bf40bf289f878699f8f8746374f59ab781542067c1d853" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.116758 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jgjsw" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.211016 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t"] Oct 06 13:48:36 crc kubenswrapper[4867]: E1006 13:48:36.211791 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d873c69-5229-4308-ac4a-e6e83a067a42" containerName="collect-profiles" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.211899 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d873c69-5229-4308-ac4a-e6e83a067a42" containerName="collect-profiles" Oct 06 13:48:36 crc kubenswrapper[4867]: E1006 13:48:36.212021 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33622851-83c0-48c9-969d-99f96fbcb64f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.212095 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="33622851-83c0-48c9-969d-99f96fbcb64f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.212479 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d873c69-5229-4308-ac4a-e6e83a067a42" containerName="collect-profiles" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.212630 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="33622851-83c0-48c9-969d-99f96fbcb64f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.213699 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.216025 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.216365 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.218753 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rcppz" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.221555 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.221775 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.222892 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t"] Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.351072 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.351157 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnk5b\" (UniqueName: \"kubernetes.io/projected/a13c4977-6a03-4678-b394-0b33d74ee2a8-kube-api-access-mnk5b\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.351223 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.351280 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.351400 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.351479 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.351629 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.453644 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.453707 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.453741 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.453766 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.453832 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.453878 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.453920 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnk5b\" (UniqueName: \"kubernetes.io/projected/a13c4977-6a03-4678-b394-0b33d74ee2a8-kube-api-access-mnk5b\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.457937 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.458142 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.458886 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.458925 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.461881 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.462156 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.469297 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnk5b\" (UniqueName: \"kubernetes.io/projected/a13c4977-6a03-4678-b394-0b33d74ee2a8-kube-api-access-mnk5b\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.532929 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.693646 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nh5g7"] Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.703469 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.715832 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nh5g7"] Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.866658 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5znk7\" (UniqueName: \"kubernetes.io/projected/832ad292-cee7-4c56-89d4-edc633b608fd-kube-api-access-5znk7\") pod \"certified-operators-nh5g7\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.866848 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-catalog-content\") pod \"certified-operators-nh5g7\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.866935 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-utilities\") pod \"certified-operators-nh5g7\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.969091 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5znk7\" (UniqueName: \"kubernetes.io/projected/832ad292-cee7-4c56-89d4-edc633b608fd-kube-api-access-5znk7\") pod \"certified-operators-nh5g7\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.969218 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-catalog-content\") pod \"certified-operators-nh5g7\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.969265 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-utilities\") pod \"certified-operators-nh5g7\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.969868 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-catalog-content\") pod \"certified-operators-nh5g7\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.969906 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-utilities\") pod \"certified-operators-nh5g7\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:36 crc kubenswrapper[4867]: I1006 13:48:36.988964 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5znk7\" (UniqueName: \"kubernetes.io/projected/832ad292-cee7-4c56-89d4-edc633b608fd-kube-api-access-5znk7\") pod \"certified-operators-nh5g7\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:37 crc kubenswrapper[4867]: I1006 13:48:37.035139 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:37 crc kubenswrapper[4867]: I1006 13:48:37.140125 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t"] Oct 06 13:48:37 crc kubenswrapper[4867]: I1006 13:48:37.567500 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nh5g7"] Oct 06 13:48:37 crc kubenswrapper[4867]: W1006 13:48:37.568826 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod832ad292_cee7_4c56_89d4_edc633b608fd.slice/crio-f017f2cd2ad98bcbf4e1742b002ca597a67b40b094bb8c2f4620a5cf988f6168 WatchSource:0}: Error finding container f017f2cd2ad98bcbf4e1742b002ca597a67b40b094bb8c2f4620a5cf988f6168: Status 404 returned error can't find the container with id f017f2cd2ad98bcbf4e1742b002ca597a67b40b094bb8c2f4620a5cf988f6168 Oct 06 13:48:38 crc kubenswrapper[4867]: I1006 13:48:38.139796 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" event={"ID":"a13c4977-6a03-4678-b394-0b33d74ee2a8","Type":"ContainerStarted","Data":"213030e53c3b158eba338c3e14aa415f762e2eec5d20529a0f4c153ab214dbff"} Oct 06 13:48:38 crc kubenswrapper[4867]: I1006 13:48:38.142425 4867 generic.go:334] "Generic (PLEG): container finished" podID="832ad292-cee7-4c56-89d4-edc633b608fd" containerID="a97656b0298f092eefcf4621137ad1e31793211da442c8650518c230267669b3" exitCode=0 Oct 06 13:48:38 crc kubenswrapper[4867]: I1006 13:48:38.142472 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh5g7" event={"ID":"832ad292-cee7-4c56-89d4-edc633b608fd","Type":"ContainerDied","Data":"a97656b0298f092eefcf4621137ad1e31793211da442c8650518c230267669b3"} Oct 06 13:48:38 crc kubenswrapper[4867]: I1006 13:48:38.142506 4867 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-nh5g7" event={"ID":"832ad292-cee7-4c56-89d4-edc633b608fd","Type":"ContainerStarted","Data":"f017f2cd2ad98bcbf4e1742b002ca597a67b40b094bb8c2f4620a5cf988f6168"} Oct 06 13:48:39 crc kubenswrapper[4867]: I1006 13:48:39.155601 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" event={"ID":"a13c4977-6a03-4678-b394-0b33d74ee2a8","Type":"ContainerStarted","Data":"c5d3b3a5c3467b5f4f159579b71edb66b8fb4308791b9b5efe53d7da2f6257a9"} Oct 06 13:48:39 crc kubenswrapper[4867]: I1006 13:48:39.188359 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" podStartSLOduration=2.353804014 podStartE2EDuration="3.188341437s" podCreationTimestamp="2025-10-06 13:48:36 +0000 UTC" firstStartedPulling="2025-10-06 13:48:37.149462807 +0000 UTC m=+2696.607410961" lastFinishedPulling="2025-10-06 13:48:37.98400025 +0000 UTC m=+2697.441948384" observedRunningTime="2025-10-06 13:48:39.180151254 +0000 UTC m=+2698.638099398" watchObservedRunningTime="2025-10-06 13:48:39.188341437 +0000 UTC m=+2698.646289581" Oct 06 13:48:40 crc kubenswrapper[4867]: I1006 13:48:40.167172 4867 generic.go:334] "Generic (PLEG): container finished" podID="832ad292-cee7-4c56-89d4-edc633b608fd" containerID="fe6eb4c37d26c4ad3b35a8419f80b22225f110c32e2b0206f64b7118b44cbb1e" exitCode=0 Oct 06 13:48:40 crc kubenswrapper[4867]: I1006 13:48:40.167282 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh5g7" event={"ID":"832ad292-cee7-4c56-89d4-edc633b608fd","Type":"ContainerDied","Data":"fe6eb4c37d26c4ad3b35a8419f80b22225f110c32e2b0206f64b7118b44cbb1e"} Oct 06 13:48:41 crc kubenswrapper[4867]: I1006 13:48:41.179015 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh5g7" 
event={"ID":"832ad292-cee7-4c56-89d4-edc633b608fd","Type":"ContainerStarted","Data":"ebe9c5060a2f0e1ca1d0fc65bf07c9f7b1bdfd6ab929c963872057a397e7a980"} Oct 06 13:48:41 crc kubenswrapper[4867]: I1006 13:48:41.204803 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nh5g7" podStartSLOduration=2.55387182 podStartE2EDuration="5.204785837s" podCreationTimestamp="2025-10-06 13:48:36 +0000 UTC" firstStartedPulling="2025-10-06 13:48:38.146493093 +0000 UTC m=+2697.604441237" lastFinishedPulling="2025-10-06 13:48:40.79740711 +0000 UTC m=+2700.255355254" observedRunningTime="2025-10-06 13:48:41.203368908 +0000 UTC m=+2700.661317052" watchObservedRunningTime="2025-10-06 13:48:41.204785837 +0000 UTC m=+2700.662733981" Oct 06 13:48:41 crc kubenswrapper[4867]: I1006 13:48:41.228223 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:48:41 crc kubenswrapper[4867]: E1006 13:48:41.228573 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.720552 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p9mkz"] Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.724461 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.752079 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p9mkz"] Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.817486 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlcbb\" (UniqueName: \"kubernetes.io/projected/cf28b048-e55f-450e-962e-dfd469b7e9a7-kube-api-access-nlcbb\") pod \"redhat-operators-p9mkz\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.817610 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-catalog-content\") pod \"redhat-operators-p9mkz\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.817648 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-utilities\") pod \"redhat-operators-p9mkz\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.919847 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlcbb\" (UniqueName: \"kubernetes.io/projected/cf28b048-e55f-450e-962e-dfd469b7e9a7-kube-api-access-nlcbb\") pod \"redhat-operators-p9mkz\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.919982 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-catalog-content\") pod \"redhat-operators-p9mkz\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.920022 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-utilities\") pod \"redhat-operators-p9mkz\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.920698 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-utilities\") pod \"redhat-operators-p9mkz\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.920714 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-catalog-content\") pod \"redhat-operators-p9mkz\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:43 crc kubenswrapper[4867]: I1006 13:48:43.948138 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlcbb\" (UniqueName: \"kubernetes.io/projected/cf28b048-e55f-450e-962e-dfd469b7e9a7-kube-api-access-nlcbb\") pod \"redhat-operators-p9mkz\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:44 crc kubenswrapper[4867]: I1006 13:48:44.055772 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:44 crc kubenswrapper[4867]: I1006 13:48:44.528551 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p9mkz"] Oct 06 13:48:44 crc kubenswrapper[4867]: W1006 13:48:44.531881 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf28b048_e55f_450e_962e_dfd469b7e9a7.slice/crio-87de1e8219cc4642de4c93148096b62741b81dd11c92f77b6f53a72c75b92bcb WatchSource:0}: Error finding container 87de1e8219cc4642de4c93148096b62741b81dd11c92f77b6f53a72c75b92bcb: Status 404 returned error can't find the container with id 87de1e8219cc4642de4c93148096b62741b81dd11c92f77b6f53a72c75b92bcb Oct 06 13:48:45 crc kubenswrapper[4867]: I1006 13:48:45.261892 4867 generic.go:334] "Generic (PLEG): container finished" podID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerID="dc8c3e466142c9d84b36b386d05cd309dabbd3283f7876b512939b6b79dba1c4" exitCode=0 Oct 06 13:48:45 crc kubenswrapper[4867]: I1006 13:48:45.261933 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p9mkz" event={"ID":"cf28b048-e55f-450e-962e-dfd469b7e9a7","Type":"ContainerDied","Data":"dc8c3e466142c9d84b36b386d05cd309dabbd3283f7876b512939b6b79dba1c4"} Oct 06 13:48:45 crc kubenswrapper[4867]: I1006 13:48:45.261961 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p9mkz" event={"ID":"cf28b048-e55f-450e-962e-dfd469b7e9a7","Type":"ContainerStarted","Data":"87de1e8219cc4642de4c93148096b62741b81dd11c92f77b6f53a72c75b92bcb"} Oct 06 13:48:47 crc kubenswrapper[4867]: I1006 13:48:47.035721 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:47 crc kubenswrapper[4867]: I1006 13:48:47.036336 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:47 crc kubenswrapper[4867]: I1006 13:48:47.083689 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:47 crc kubenswrapper[4867]: I1006 13:48:47.289560 4867 generic.go:334] "Generic (PLEG): container finished" podID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerID="ec82222c5f321d224c9740a504d03fc4468f0f748220c3be6a0370f448b292c9" exitCode=0 Oct 06 13:48:47 crc kubenswrapper[4867]: I1006 13:48:47.289678 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p9mkz" event={"ID":"cf28b048-e55f-450e-962e-dfd469b7e9a7","Type":"ContainerDied","Data":"ec82222c5f321d224c9740a504d03fc4468f0f748220c3be6a0370f448b292c9"} Oct 06 13:48:47 crc kubenswrapper[4867]: I1006 13:48:47.371191 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:49 crc kubenswrapper[4867]: I1006 13:48:49.063019 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nh5g7"] Oct 06 13:48:49 crc kubenswrapper[4867]: I1006 13:48:49.308678 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nh5g7" podUID="832ad292-cee7-4c56-89d4-edc633b608fd" containerName="registry-server" containerID="cri-o://ebe9c5060a2f0e1ca1d0fc65bf07c9f7b1bdfd6ab929c963872057a397e7a980" gracePeriod=2 Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.349525 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p9mkz" event={"ID":"cf28b048-e55f-450e-962e-dfd469b7e9a7","Type":"ContainerStarted","Data":"f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9"} Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.364397 4867 generic.go:334] "Generic (PLEG): 
container finished" podID="832ad292-cee7-4c56-89d4-edc633b608fd" containerID="ebe9c5060a2f0e1ca1d0fc65bf07c9f7b1bdfd6ab929c963872057a397e7a980" exitCode=0 Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.364473 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh5g7" event={"ID":"832ad292-cee7-4c56-89d4-edc633b608fd","Type":"ContainerDied","Data":"ebe9c5060a2f0e1ca1d0fc65bf07c9f7b1bdfd6ab929c963872057a397e7a980"} Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.376554 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p9mkz" podStartSLOduration=3.024020508 podStartE2EDuration="9.376528356s" podCreationTimestamp="2025-10-06 13:48:43 +0000 UTC" firstStartedPulling="2025-10-06 13:48:45.26611217 +0000 UTC m=+2704.724060324" lastFinishedPulling="2025-10-06 13:48:51.618620028 +0000 UTC m=+2711.076568172" observedRunningTime="2025-10-06 13:48:52.373207616 +0000 UTC m=+2711.831155760" watchObservedRunningTime="2025-10-06 13:48:52.376528356 +0000 UTC m=+2711.834476510" Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.630895 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.709000 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-utilities\") pod \"832ad292-cee7-4c56-89d4-edc633b608fd\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.709056 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5znk7\" (UniqueName: \"kubernetes.io/projected/832ad292-cee7-4c56-89d4-edc633b608fd-kube-api-access-5znk7\") pod \"832ad292-cee7-4c56-89d4-edc633b608fd\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.709227 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-catalog-content\") pod \"832ad292-cee7-4c56-89d4-edc633b608fd\" (UID: \"832ad292-cee7-4c56-89d4-edc633b608fd\") " Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.709742 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-utilities" (OuterVolumeSpecName: "utilities") pod "832ad292-cee7-4c56-89d4-edc633b608fd" (UID: "832ad292-cee7-4c56-89d4-edc633b608fd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.710546 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.720472 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/832ad292-cee7-4c56-89d4-edc633b608fd-kube-api-access-5znk7" (OuterVolumeSpecName: "kube-api-access-5znk7") pod "832ad292-cee7-4c56-89d4-edc633b608fd" (UID: "832ad292-cee7-4c56-89d4-edc633b608fd"). InnerVolumeSpecName "kube-api-access-5znk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.766569 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "832ad292-cee7-4c56-89d4-edc633b608fd" (UID: "832ad292-cee7-4c56-89d4-edc633b608fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.812104 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5znk7\" (UniqueName: \"kubernetes.io/projected/832ad292-cee7-4c56-89d4-edc633b608fd-kube-api-access-5znk7\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:52 crc kubenswrapper[4867]: I1006 13:48:52.812275 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/832ad292-cee7-4c56-89d4-edc633b608fd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:53 crc kubenswrapper[4867]: I1006 13:48:53.376718 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nh5g7" event={"ID":"832ad292-cee7-4c56-89d4-edc633b608fd","Type":"ContainerDied","Data":"f017f2cd2ad98bcbf4e1742b002ca597a67b40b094bb8c2f4620a5cf988f6168"} Oct 06 13:48:53 crc kubenswrapper[4867]: I1006 13:48:53.376812 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nh5g7" Oct 06 13:48:53 crc kubenswrapper[4867]: I1006 13:48:53.377041 4867 scope.go:117] "RemoveContainer" containerID="ebe9c5060a2f0e1ca1d0fc65bf07c9f7b1bdfd6ab929c963872057a397e7a980" Oct 06 13:48:53 crc kubenswrapper[4867]: I1006 13:48:53.418654 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nh5g7"] Oct 06 13:48:53 crc kubenswrapper[4867]: I1006 13:48:53.419447 4867 scope.go:117] "RemoveContainer" containerID="fe6eb4c37d26c4ad3b35a8419f80b22225f110c32e2b0206f64b7118b44cbb1e" Oct 06 13:48:53 crc kubenswrapper[4867]: I1006 13:48:53.432166 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nh5g7"] Oct 06 13:48:53 crc kubenswrapper[4867]: I1006 13:48:53.446002 4867 scope.go:117] "RemoveContainer" containerID="a97656b0298f092eefcf4621137ad1e31793211da442c8650518c230267669b3" Oct 06 13:48:54 crc kubenswrapper[4867]: I1006 13:48:54.056447 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:54 crc kubenswrapper[4867]: I1006 13:48:54.056500 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:48:55 crc kubenswrapper[4867]: I1006 13:48:55.121323 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p9mkz" podUID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerName="registry-server" probeResult="failure" output=< Oct 06 13:48:55 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Oct 06 13:48:55 crc kubenswrapper[4867]: > Oct 06 13:48:55 crc kubenswrapper[4867]: I1006 13:48:55.222945 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d" Oct 06 13:48:55 crc kubenswrapper[4867]: I1006 13:48:55.256587 4867 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="832ad292-cee7-4c56-89d4-edc633b608fd" path="/var/lib/kubelet/pods/832ad292-cee7-4c56-89d4-edc633b608fd/volumes" Oct 06 13:48:56 crc kubenswrapper[4867]: I1006 13:48:56.415575 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"6a57890df50f4067aad7fbbb27b3b2f13fa6a4de238cef56d72bff88beae742d"} Oct 06 13:49:04 crc kubenswrapper[4867]: I1006 13:49:04.140640 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:49:04 crc kubenswrapper[4867]: I1006 13:49:04.270822 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:49:04 crc kubenswrapper[4867]: I1006 13:49:04.413026 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p9mkz"] Oct 06 13:49:05 crc kubenswrapper[4867]: I1006 13:49:05.498490 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p9mkz" podUID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerName="registry-server" containerID="cri-o://f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9" gracePeriod=2 Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.039315 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.182075 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlcbb\" (UniqueName: \"kubernetes.io/projected/cf28b048-e55f-450e-962e-dfd469b7e9a7-kube-api-access-nlcbb\") pod \"cf28b048-e55f-450e-962e-dfd469b7e9a7\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.182210 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-catalog-content\") pod \"cf28b048-e55f-450e-962e-dfd469b7e9a7\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.184746 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-utilities\") pod \"cf28b048-e55f-450e-962e-dfd469b7e9a7\" (UID: \"cf28b048-e55f-450e-962e-dfd469b7e9a7\") " Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.185382 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-utilities" (OuterVolumeSpecName: "utilities") pod "cf28b048-e55f-450e-962e-dfd469b7e9a7" (UID: "cf28b048-e55f-450e-962e-dfd469b7e9a7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.186136 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.190592 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf28b048-e55f-450e-962e-dfd469b7e9a7-kube-api-access-nlcbb" (OuterVolumeSpecName: "kube-api-access-nlcbb") pod "cf28b048-e55f-450e-962e-dfd469b7e9a7" (UID: "cf28b048-e55f-450e-962e-dfd469b7e9a7"). InnerVolumeSpecName "kube-api-access-nlcbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.256909 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf28b048-e55f-450e-962e-dfd469b7e9a7" (UID: "cf28b048-e55f-450e-962e-dfd469b7e9a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.288575 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlcbb\" (UniqueName: \"kubernetes.io/projected/cf28b048-e55f-450e-962e-dfd469b7e9a7-kube-api-access-nlcbb\") on node \"crc\" DevicePath \"\"" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.288622 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf28b048-e55f-450e-962e-dfd469b7e9a7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.514717 4867 generic.go:334] "Generic (PLEG): container finished" podID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerID="f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9" exitCode=0 Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.514775 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p9mkz" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.514794 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p9mkz" event={"ID":"cf28b048-e55f-450e-962e-dfd469b7e9a7","Type":"ContainerDied","Data":"f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9"} Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.514853 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p9mkz" event={"ID":"cf28b048-e55f-450e-962e-dfd469b7e9a7","Type":"ContainerDied","Data":"87de1e8219cc4642de4c93148096b62741b81dd11c92f77b6f53a72c75b92bcb"} Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.514887 4867 scope.go:117] "RemoveContainer" containerID="f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.544689 4867 scope.go:117] "RemoveContainer" 
containerID="ec82222c5f321d224c9740a504d03fc4468f0f748220c3be6a0370f448b292c9" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.552482 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p9mkz"] Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.562004 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p9mkz"] Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.589418 4867 scope.go:117] "RemoveContainer" containerID="dc8c3e466142c9d84b36b386d05cd309dabbd3283f7876b512939b6b79dba1c4" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.621284 4867 scope.go:117] "RemoveContainer" containerID="f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9" Oct 06 13:49:06 crc kubenswrapper[4867]: E1006 13:49:06.621742 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9\": container with ID starting with f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9 not found: ID does not exist" containerID="f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.621792 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9"} err="failed to get container status \"f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9\": rpc error: code = NotFound desc = could not find container \"f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9\": container with ID starting with f20e2686e34426ff6239eb0845cfe2c01bffbe152fe005b91fa25fc76dd194a9 not found: ID does not exist" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.621817 4867 scope.go:117] "RemoveContainer" 
containerID="ec82222c5f321d224c9740a504d03fc4468f0f748220c3be6a0370f448b292c9" Oct 06 13:49:06 crc kubenswrapper[4867]: E1006 13:49:06.622136 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec82222c5f321d224c9740a504d03fc4468f0f748220c3be6a0370f448b292c9\": container with ID starting with ec82222c5f321d224c9740a504d03fc4468f0f748220c3be6a0370f448b292c9 not found: ID does not exist" containerID="ec82222c5f321d224c9740a504d03fc4468f0f748220c3be6a0370f448b292c9" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.622168 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec82222c5f321d224c9740a504d03fc4468f0f748220c3be6a0370f448b292c9"} err="failed to get container status \"ec82222c5f321d224c9740a504d03fc4468f0f748220c3be6a0370f448b292c9\": rpc error: code = NotFound desc = could not find container \"ec82222c5f321d224c9740a504d03fc4468f0f748220c3be6a0370f448b292c9\": container with ID starting with ec82222c5f321d224c9740a504d03fc4468f0f748220c3be6a0370f448b292c9 not found: ID does not exist" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.622189 4867 scope.go:117] "RemoveContainer" containerID="dc8c3e466142c9d84b36b386d05cd309dabbd3283f7876b512939b6b79dba1c4" Oct 06 13:49:06 crc kubenswrapper[4867]: E1006 13:49:06.622425 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc8c3e466142c9d84b36b386d05cd309dabbd3283f7876b512939b6b79dba1c4\": container with ID starting with dc8c3e466142c9d84b36b386d05cd309dabbd3283f7876b512939b6b79dba1c4 not found: ID does not exist" containerID="dc8c3e466142c9d84b36b386d05cd309dabbd3283f7876b512939b6b79dba1c4" Oct 06 13:49:06 crc kubenswrapper[4867]: I1006 13:49:06.622455 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dc8c3e466142c9d84b36b386d05cd309dabbd3283f7876b512939b6b79dba1c4"} err="failed to get container status \"dc8c3e466142c9d84b36b386d05cd309dabbd3283f7876b512939b6b79dba1c4\": rpc error: code = NotFound desc = could not find container \"dc8c3e466142c9d84b36b386d05cd309dabbd3283f7876b512939b6b79dba1c4\": container with ID starting with dc8c3e466142c9d84b36b386d05cd309dabbd3283f7876b512939b6b79dba1c4 not found: ID does not exist" Oct 06 13:49:07 crc kubenswrapper[4867]: I1006 13:49:07.238106 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf28b048-e55f-450e-962e-dfd469b7e9a7" path="/var/lib/kubelet/pods/cf28b048-e55f-450e-962e-dfd469b7e9a7/volumes" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.436070 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hvtfs"] Oct 06 13:49:20 crc kubenswrapper[4867]: E1006 13:49:20.437158 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832ad292-cee7-4c56-89d4-edc633b608fd" containerName="extract-content" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.437174 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="832ad292-cee7-4c56-89d4-edc633b608fd" containerName="extract-content" Oct 06 13:49:20 crc kubenswrapper[4867]: E1006 13:49:20.437193 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerName="extract-utilities" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.437200 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerName="extract-utilities" Oct 06 13:49:20 crc kubenswrapper[4867]: E1006 13:49:20.437228 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerName="registry-server" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.437235 4867 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerName="registry-server" Oct 06 13:49:20 crc kubenswrapper[4867]: E1006 13:49:20.437289 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerName="extract-content" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.437298 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerName="extract-content" Oct 06 13:49:20 crc kubenswrapper[4867]: E1006 13:49:20.437308 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832ad292-cee7-4c56-89d4-edc633b608fd" containerName="extract-utilities" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.437315 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="832ad292-cee7-4c56-89d4-edc633b608fd" containerName="extract-utilities" Oct 06 13:49:20 crc kubenswrapper[4867]: E1006 13:49:20.437323 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832ad292-cee7-4c56-89d4-edc633b608fd" containerName="registry-server" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.437330 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="832ad292-cee7-4c56-89d4-edc633b608fd" containerName="registry-server" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.437566 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="832ad292-cee7-4c56-89d4-edc633b608fd" containerName="registry-server" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.437584 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf28b048-e55f-450e-962e-dfd469b7e9a7" containerName="registry-server" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.441787 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.456222 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvtfs"] Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.538159 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-catalog-content\") pod \"community-operators-hvtfs\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.538223 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2gqn\" (UniqueName: \"kubernetes.io/projected/b8859655-f6fd-460c-8ca8-5e8d2eb87866-kube-api-access-x2gqn\") pod \"community-operators-hvtfs\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.538414 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-utilities\") pod \"community-operators-hvtfs\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.640655 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-catalog-content\") pod \"community-operators-hvtfs\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.640734 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x2gqn\" (UniqueName: \"kubernetes.io/projected/b8859655-f6fd-460c-8ca8-5e8d2eb87866-kube-api-access-x2gqn\") pod \"community-operators-hvtfs\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.640762 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-utilities\") pod \"community-operators-hvtfs\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.641451 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-catalog-content\") pod \"community-operators-hvtfs\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.641499 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-utilities\") pod \"community-operators-hvtfs\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.681958 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2gqn\" (UniqueName: \"kubernetes.io/projected/b8859655-f6fd-460c-8ca8-5e8d2eb87866-kube-api-access-x2gqn\") pod \"community-operators-hvtfs\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:20 crc kubenswrapper[4867]: I1006 13:49:20.784691 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:21 crc kubenswrapper[4867]: I1006 13:49:21.382381 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvtfs"] Oct 06 13:49:21 crc kubenswrapper[4867]: I1006 13:49:21.740163 4867 generic.go:334] "Generic (PLEG): container finished" podID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" containerID="6c1fbff03a4315b7bd9a7d7968b17457aba904bc5520a818c0106f067578bbd0" exitCode=0 Oct 06 13:49:21 crc kubenswrapper[4867]: I1006 13:49:21.740293 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvtfs" event={"ID":"b8859655-f6fd-460c-8ca8-5e8d2eb87866","Type":"ContainerDied","Data":"6c1fbff03a4315b7bd9a7d7968b17457aba904bc5520a818c0106f067578bbd0"} Oct 06 13:49:21 crc kubenswrapper[4867]: I1006 13:49:21.740652 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvtfs" event={"ID":"b8859655-f6fd-460c-8ca8-5e8d2eb87866","Type":"ContainerStarted","Data":"abcb34a201876395c2009d4f7d4522b3adef77a5f67a574d474d5bc297b3c733"} Oct 06 13:49:21 crc kubenswrapper[4867]: I1006 13:49:21.743164 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:49:22 crc kubenswrapper[4867]: I1006 13:49:22.752976 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvtfs" event={"ID":"b8859655-f6fd-460c-8ca8-5e8d2eb87866","Type":"ContainerStarted","Data":"7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec"} Oct 06 13:49:23 crc kubenswrapper[4867]: I1006 13:49:23.765900 4867 generic.go:334] "Generic (PLEG): container finished" podID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" containerID="7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec" exitCode=0 Oct 06 13:49:23 crc kubenswrapper[4867]: I1006 13:49:23.766023 4867 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-hvtfs" event={"ID":"b8859655-f6fd-460c-8ca8-5e8d2eb87866","Type":"ContainerDied","Data":"7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec"} Oct 06 13:49:24 crc kubenswrapper[4867]: I1006 13:49:24.779087 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvtfs" event={"ID":"b8859655-f6fd-460c-8ca8-5e8d2eb87866","Type":"ContainerStarted","Data":"5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880"} Oct 06 13:49:24 crc kubenswrapper[4867]: I1006 13:49:24.809850 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hvtfs" podStartSLOduration=2.244700107 podStartE2EDuration="4.809819529s" podCreationTimestamp="2025-10-06 13:49:20 +0000 UTC" firstStartedPulling="2025-10-06 13:49:21.7428841 +0000 UTC m=+2741.200832234" lastFinishedPulling="2025-10-06 13:49:24.308003512 +0000 UTC m=+2743.765951656" observedRunningTime="2025-10-06 13:49:24.80322563 +0000 UTC m=+2744.261173764" watchObservedRunningTime="2025-10-06 13:49:24.809819529 +0000 UTC m=+2744.267767693" Oct 06 13:49:30 crc kubenswrapper[4867]: I1006 13:49:30.785712 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:30 crc kubenswrapper[4867]: I1006 13:49:30.787549 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:30 crc kubenswrapper[4867]: I1006 13:49:30.847967 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:31 crc kubenswrapper[4867]: I1006 13:49:31.895992 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:31 crc kubenswrapper[4867]: I1006 13:49:31.949565 
4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvtfs"] Oct 06 13:49:33 crc kubenswrapper[4867]: I1006 13:49:33.864017 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hvtfs" podUID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" containerName="registry-server" containerID="cri-o://5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880" gracePeriod=2 Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.384217 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.456836 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-catalog-content\") pod \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.457050 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2gqn\" (UniqueName: \"kubernetes.io/projected/b8859655-f6fd-460c-8ca8-5e8d2eb87866-kube-api-access-x2gqn\") pod \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.457205 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-utilities\") pod \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\" (UID: \"b8859655-f6fd-460c-8ca8-5e8d2eb87866\") " Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.458094 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-utilities" (OuterVolumeSpecName: "utilities") pod 
"b8859655-f6fd-460c-8ca8-5e8d2eb87866" (UID: "b8859655-f6fd-460c-8ca8-5e8d2eb87866"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.458483 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.482414 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8859655-f6fd-460c-8ca8-5e8d2eb87866-kube-api-access-x2gqn" (OuterVolumeSpecName: "kube-api-access-x2gqn") pod "b8859655-f6fd-460c-8ca8-5e8d2eb87866" (UID: "b8859655-f6fd-460c-8ca8-5e8d2eb87866"). InnerVolumeSpecName "kube-api-access-x2gqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.523707 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8859655-f6fd-460c-8ca8-5e8d2eb87866" (UID: "b8859655-f6fd-460c-8ca8-5e8d2eb87866"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.560605 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8859655-f6fd-460c-8ca8-5e8d2eb87866-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.560649 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2gqn\" (UniqueName: \"kubernetes.io/projected/b8859655-f6fd-460c-8ca8-5e8d2eb87866-kube-api-access-x2gqn\") on node \"crc\" DevicePath \"\"" Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.879596 4867 generic.go:334] "Generic (PLEG): container finished" podID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" containerID="5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880" exitCode=0 Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.879641 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvtfs" event={"ID":"b8859655-f6fd-460c-8ca8-5e8d2eb87866","Type":"ContainerDied","Data":"5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880"} Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.879675 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvtfs" event={"ID":"b8859655-f6fd-460c-8ca8-5e8d2eb87866","Type":"ContainerDied","Data":"abcb34a201876395c2009d4f7d4522b3adef77a5f67a574d474d5bc297b3c733"} Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.879696 4867 scope.go:117] "RemoveContainer" containerID="5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880" Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.879715 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvtfs" Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.954182 4867 scope.go:117] "RemoveContainer" containerID="7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec" Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.955809 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvtfs"] Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.968389 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hvtfs"] Oct 06 13:49:34 crc kubenswrapper[4867]: I1006 13:49:34.987021 4867 scope.go:117] "RemoveContainer" containerID="6c1fbff03a4315b7bd9a7d7968b17457aba904bc5520a818c0106f067578bbd0" Oct 06 13:49:35 crc kubenswrapper[4867]: I1006 13:49:35.026560 4867 scope.go:117] "RemoveContainer" containerID="5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880" Oct 06 13:49:35 crc kubenswrapper[4867]: E1006 13:49:35.027058 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880\": container with ID starting with 5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880 not found: ID does not exist" containerID="5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880" Oct 06 13:49:35 crc kubenswrapper[4867]: I1006 13:49:35.027109 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880"} err="failed to get container status \"5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880\": rpc error: code = NotFound desc = could not find container \"5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880\": container with ID starting with 5cf2a322a4978ef7abbcf9e37135e3a5b1b60876bd44383622c6d80760f80880 not 
found: ID does not exist" Oct 06 13:49:35 crc kubenswrapper[4867]: I1006 13:49:35.027144 4867 scope.go:117] "RemoveContainer" containerID="7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec" Oct 06 13:49:35 crc kubenswrapper[4867]: E1006 13:49:35.027695 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec\": container with ID starting with 7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec not found: ID does not exist" containerID="7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec" Oct 06 13:49:35 crc kubenswrapper[4867]: I1006 13:49:35.027729 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec"} err="failed to get container status \"7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec\": rpc error: code = NotFound desc = could not find container \"7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec\": container with ID starting with 7c5793998618721a74e3a1ba29b816e1b6e7c658d98458c93dacdf0c17f128ec not found: ID does not exist" Oct 06 13:49:35 crc kubenswrapper[4867]: I1006 13:49:35.027752 4867 scope.go:117] "RemoveContainer" containerID="6c1fbff03a4315b7bd9a7d7968b17457aba904bc5520a818c0106f067578bbd0" Oct 06 13:49:35 crc kubenswrapper[4867]: E1006 13:49:35.028335 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c1fbff03a4315b7bd9a7d7968b17457aba904bc5520a818c0106f067578bbd0\": container with ID starting with 6c1fbff03a4315b7bd9a7d7968b17457aba904bc5520a818c0106f067578bbd0 not found: ID does not exist" containerID="6c1fbff03a4315b7bd9a7d7968b17457aba904bc5520a818c0106f067578bbd0" Oct 06 13:49:35 crc kubenswrapper[4867]: I1006 13:49:35.028379 4867 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1fbff03a4315b7bd9a7d7968b17457aba904bc5520a818c0106f067578bbd0"} err="failed to get container status \"6c1fbff03a4315b7bd9a7d7968b17457aba904bc5520a818c0106f067578bbd0\": rpc error: code = NotFound desc = could not find container \"6c1fbff03a4315b7bd9a7d7968b17457aba904bc5520a818c0106f067578bbd0\": container with ID starting with 6c1fbff03a4315b7bd9a7d7968b17457aba904bc5520a818c0106f067578bbd0 not found: ID does not exist" Oct 06 13:49:35 crc kubenswrapper[4867]: I1006 13:49:35.239889 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" path="/var/lib/kubelet/pods/b8859655-f6fd-460c-8ca8-5e8d2eb87866/volumes" Oct 06 13:51:02 crc kubenswrapper[4867]: I1006 13:51:02.893698 4867 generic.go:334] "Generic (PLEG): container finished" podID="a13c4977-6a03-4678-b394-0b33d74ee2a8" containerID="c5d3b3a5c3467b5f4f159579b71edb66b8fb4308791b9b5efe53d7da2f6257a9" exitCode=0 Oct 06 13:51:02 crc kubenswrapper[4867]: I1006 13:51:02.893778 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" event={"ID":"a13c4977-6a03-4678-b394-0b33d74ee2a8","Type":"ContainerDied","Data":"c5d3b3a5c3467b5f4f159579b71edb66b8fb4308791b9b5efe53d7da2f6257a9"} Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.299228 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.415951 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-2\") pod \"a13c4977-6a03-4678-b394-0b33d74ee2a8\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.416370 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-0\") pod \"a13c4977-6a03-4678-b394-0b33d74ee2a8\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.416429 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-1\") pod \"a13c4977-6a03-4678-b394-0b33d74ee2a8\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.416534 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-telemetry-combined-ca-bundle\") pod \"a13c4977-6a03-4678-b394-0b33d74ee2a8\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.416654 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ssh-key\") pod \"a13c4977-6a03-4678-b394-0b33d74ee2a8\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " Oct 06 13:51:04 crc 
kubenswrapper[4867]: I1006 13:51:04.416691 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnk5b\" (UniqueName: \"kubernetes.io/projected/a13c4977-6a03-4678-b394-0b33d74ee2a8-kube-api-access-mnk5b\") pod \"a13c4977-6a03-4678-b394-0b33d74ee2a8\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.416779 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-inventory\") pod \"a13c4977-6a03-4678-b394-0b33d74ee2a8\" (UID: \"a13c4977-6a03-4678-b394-0b33d74ee2a8\") " Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.422812 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a13c4977-6a03-4678-b394-0b33d74ee2a8" (UID: "a13c4977-6a03-4678-b394-0b33d74ee2a8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.430646 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13c4977-6a03-4678-b394-0b33d74ee2a8-kube-api-access-mnk5b" (OuterVolumeSpecName: "kube-api-access-mnk5b") pod "a13c4977-6a03-4678-b394-0b33d74ee2a8" (UID: "a13c4977-6a03-4678-b394-0b33d74ee2a8"). InnerVolumeSpecName "kube-api-access-mnk5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.450126 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a13c4977-6a03-4678-b394-0b33d74ee2a8" (UID: "a13c4977-6a03-4678-b394-0b33d74ee2a8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.452339 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-inventory" (OuterVolumeSpecName: "inventory") pod "a13c4977-6a03-4678-b394-0b33d74ee2a8" (UID: "a13c4977-6a03-4678-b394-0b33d74ee2a8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.456451 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a13c4977-6a03-4678-b394-0b33d74ee2a8" (UID: "a13c4977-6a03-4678-b394-0b33d74ee2a8"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.456479 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a13c4977-6a03-4678-b394-0b33d74ee2a8" (UID: "a13c4977-6a03-4678-b394-0b33d74ee2a8"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.456495 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a13c4977-6a03-4678-b394-0b33d74ee2a8" (UID: "a13c4977-6a03-4678-b394-0b33d74ee2a8"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.519552 4867 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.519584 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.519595 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnk5b\" (UniqueName: \"kubernetes.io/projected/a13c4977-6a03-4678-b394-0b33d74ee2a8-kube-api-access-mnk5b\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.519604 4867 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.519612 4867 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.519624 4867 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.519634 4867 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/a13c4977-6a03-4678-b394-0b33d74ee2a8-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.916083 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.916016 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t" event={"ID":"a13c4977-6a03-4678-b394-0b33d74ee2a8","Type":"ContainerDied","Data":"213030e53c3b158eba338c3e14aa415f762e2eec5d20529a0f4c153ab214dbff"} Oct 06 13:51:04 crc kubenswrapper[4867]: I1006 13:51:04.916496 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="213030e53c3b158eba338c3e14aa415f762e2eec5d20529a0f4c153ab214dbff" Oct 06 13:51:12 crc kubenswrapper[4867]: I1006 13:51:12.873171 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:51:12 crc kubenswrapper[4867]: I1006 13:51:12.873806 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:51:42 crc kubenswrapper[4867]: I1006 13:51:42.032735 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:51:42 crc kubenswrapper[4867]: I1006 13:51:42.033713 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" 
podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="prometheus" containerID="cri-o://9f31e63f5b82c4a8df364cafb166381b10241cc8cbe912fe523b493ae6e21951" gracePeriod=600 Oct 06 13:51:42 crc kubenswrapper[4867]: I1006 13:51:42.033877 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="thanos-sidecar" containerID="cri-o://064974d3fb6315a201c9f6a24226589e556c74f259c14d0eb8d3c898db9995d7" gracePeriod=600 Oct 06 13:51:42 crc kubenswrapper[4867]: I1006 13:51:42.033990 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="config-reloader" containerID="cri-o://b53990b0029bf080c6eba2b5ad95272a836e4fb7e74db4d7ead2f1f083d0de16" gracePeriod=600 Oct 06 13:51:42 crc kubenswrapper[4867]: I1006 13:51:42.308612 4867 generic.go:334] "Generic (PLEG): container finished" podID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerID="064974d3fb6315a201c9f6a24226589e556c74f259c14d0eb8d3c898db9995d7" exitCode=0 Oct 06 13:51:42 crc kubenswrapper[4867]: I1006 13:51:42.308937 4867 generic.go:334] "Generic (PLEG): container finished" podID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerID="9f31e63f5b82c4a8df364cafb166381b10241cc8cbe912fe523b493ae6e21951" exitCode=0 Oct 06 13:51:42 crc kubenswrapper[4867]: I1006 13:51:42.308708 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"142692d3-42d3-469a-ab1e-e24752dd0b11","Type":"ContainerDied","Data":"064974d3fb6315a201c9f6a24226589e556c74f259c14d0eb8d3c898db9995d7"} Oct 06 13:51:42 crc kubenswrapper[4867]: I1006 13:51:42.308973 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"142692d3-42d3-469a-ab1e-e24752dd0b11","Type":"ContainerDied","Data":"9f31e63f5b82c4a8df364cafb166381b10241cc8cbe912fe523b493ae6e21951"} Oct 06 13:51:42 crc kubenswrapper[4867]: I1006 13:51:42.873363 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:51:42 crc kubenswrapper[4867]: I1006 13:51:42.873864 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.323714 4867 generic.go:334] "Generic (PLEG): container finished" podID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerID="b53990b0029bf080c6eba2b5ad95272a836e4fb7e74db4d7ead2f1f083d0de16" exitCode=0 Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.323771 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"142692d3-42d3-469a-ab1e-e24752dd0b11","Type":"ContainerDied","Data":"b53990b0029bf080c6eba2b5ad95272a836e4fb7e74db4d7ead2f1f083d0de16"} Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.323808 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"142692d3-42d3-469a-ab1e-e24752dd0b11","Type":"ContainerDied","Data":"f3578a41a1be63f4d3044a21d8333b539c020db3d7bc82567588c1d561871946"} Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.323822 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3578a41a1be63f4d3044a21d8333b539c020db3d7bc82567588c1d561871946" Oct 06 
13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.327773 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.434757 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-thanos-prometheus-http-client-file\") pod \"142692d3-42d3-469a-ab1e-e24752dd0b11\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.434862 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"142692d3-42d3-469a-ab1e-e24752dd0b11\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.434908 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/142692d3-42d3-469a-ab1e-e24752dd0b11-prometheus-metric-storage-rulefiles-0\") pod \"142692d3-42d3-469a-ab1e-e24752dd0b11\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.434950 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"142692d3-42d3-469a-ab1e-e24752dd0b11\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.434993 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config\") pod \"142692d3-42d3-469a-ab1e-e24752dd0b11\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.435014 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/142692d3-42d3-469a-ab1e-e24752dd0b11-config-out\") pod \"142692d3-42d3-469a-ab1e-e24752dd0b11\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.435047 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7g6t\" (UniqueName: \"kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-kube-api-access-g7g6t\") pod \"142692d3-42d3-469a-ab1e-e24752dd0b11\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.435117 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-tls-assets\") pod \"142692d3-42d3-469a-ab1e-e24752dd0b11\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.435934 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"142692d3-42d3-469a-ab1e-e24752dd0b11\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.435971 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-config\") pod \"142692d3-42d3-469a-ab1e-e24752dd0b11\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " Oct 
06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.435992 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-secret-combined-ca-bundle\") pod \"142692d3-42d3-469a-ab1e-e24752dd0b11\" (UID: \"142692d3-42d3-469a-ab1e-e24752dd0b11\") " Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.436600 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/142692d3-42d3-469a-ab1e-e24752dd0b11-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "142692d3-42d3-469a-ab1e-e24752dd0b11" (UID: "142692d3-42d3-469a-ab1e-e24752dd0b11"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.444455 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/142692d3-42d3-469a-ab1e-e24752dd0b11-config-out" (OuterVolumeSpecName: "config-out") pod "142692d3-42d3-469a-ab1e-e24752dd0b11" (UID: "142692d3-42d3-469a-ab1e-e24752dd0b11"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.444918 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-config" (OuterVolumeSpecName: "config") pod "142692d3-42d3-469a-ab1e-e24752dd0b11" (UID: "142692d3-42d3-469a-ab1e-e24752dd0b11"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.445038 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "142692d3-42d3-469a-ab1e-e24752dd0b11" (UID: "142692d3-42d3-469a-ab1e-e24752dd0b11"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.445141 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-kube-api-access-g7g6t" (OuterVolumeSpecName: "kube-api-access-g7g6t") pod "142692d3-42d3-469a-ab1e-e24752dd0b11" (UID: "142692d3-42d3-469a-ab1e-e24752dd0b11"). InnerVolumeSpecName "kube-api-access-g7g6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.445631 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "142692d3-42d3-469a-ab1e-e24752dd0b11" (UID: "142692d3-42d3-469a-ab1e-e24752dd0b11"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.448667 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "142692d3-42d3-469a-ab1e-e24752dd0b11" (UID: "142692d3-42d3-469a-ab1e-e24752dd0b11"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.449309 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "142692d3-42d3-469a-ab1e-e24752dd0b11" (UID: "142692d3-42d3-469a-ab1e-e24752dd0b11"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.449458 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "142692d3-42d3-469a-ab1e-e24752dd0b11" (UID: "142692d3-42d3-469a-ab1e-e24752dd0b11"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.483995 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "142692d3-42d3-469a-ab1e-e24752dd0b11" (UID: "142692d3-42d3-469a-ab1e-e24752dd0b11"). InnerVolumeSpecName "pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.538380 4867 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.538443 4867 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.538460 4867 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/142692d3-42d3-469a-ab1e-e24752dd0b11-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.538472 4867 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.538485 4867 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/142692d3-42d3-469a-ab1e-e24752dd0b11-config-out\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.538496 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7g6t\" (UniqueName: \"kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-kube-api-access-g7g6t\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.538507 
4867 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/142692d3-42d3-469a-ab1e-e24752dd0b11-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.538548 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") on node \"crc\" " Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.538561 4867 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.538573 4867 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.560564 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config" (OuterVolumeSpecName: "web-config") pod "142692d3-42d3-469a-ab1e-e24752dd0b11" (UID: "142692d3-42d3-469a-ab1e-e24752dd0b11"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.583369 4867 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.583643 4867 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4") on node "crc" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.641888 4867 reconciler_common.go:293] "Volume detached for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:43 crc kubenswrapper[4867]: I1006 13:51:43.641941 4867 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/142692d3-42d3-469a-ab1e-e24752dd0b11-web-config\") on node \"crc\" DevicePath \"\"" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.334281 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.372863 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.385185 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.411941 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:51:44 crc kubenswrapper[4867]: E1006 13:51:44.412753 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" containerName="extract-utilities" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.412838 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" containerName="extract-utilities" Oct 06 13:51:44 crc kubenswrapper[4867]: E1006 
13:51:44.412908 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="init-config-reloader" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.412971 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="init-config-reloader" Oct 06 13:51:44 crc kubenswrapper[4867]: E1006 13:51:44.413031 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13c4977-6a03-4678-b394-0b33d74ee2a8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.413088 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13c4977-6a03-4678-b394-0b33d74ee2a8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 06 13:51:44 crc kubenswrapper[4867]: E1006 13:51:44.413152 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="prometheus" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.413210 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="prometheus" Oct 06 13:51:44 crc kubenswrapper[4867]: E1006 13:51:44.413289 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="config-reloader" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.413352 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="config-reloader" Oct 06 13:51:44 crc kubenswrapper[4867]: E1006 13:51:44.413420 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" containerName="registry-server" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.413477 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" containerName="registry-server" 
Oct 06 13:51:44 crc kubenswrapper[4867]: E1006 13:51:44.413535 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" containerName="extract-content" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.413590 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" containerName="extract-content" Oct 06 13:51:44 crc kubenswrapper[4867]: E1006 13:51:44.413661 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="thanos-sidecar" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.413712 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="thanos-sidecar" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.414086 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="config-reloader" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.414177 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13c4977-6a03-4678-b394-0b33d74ee2a8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.414300 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8859655-f6fd-460c-8ca8-5e8d2eb87866" containerName="registry-server" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.414394 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="thanos-sidecar" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.414473 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" containerName="prometheus" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.416585 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.419350 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-hkmxs" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.419430 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.419507 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.419361 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.427568 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.436071 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.451962 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.595671 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.595786 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/7d54862f-97ef-4958-8b56-4f6f590fc7da-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.595815 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.595889 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.595945 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.596024 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-config\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.596132 4867 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7d54862f-97ef-4958-8b56-4f6f590fc7da-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.596193 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7d54862f-97ef-4958-8b56-4f6f590fc7da-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.596233 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slrvl\" (UniqueName: \"kubernetes.io/projected/7d54862f-97ef-4958-8b56-4f6f590fc7da-kube-api-access-slrvl\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.596305 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.596560 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.698985 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.699425 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.699480 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-config\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.699551 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7d54862f-97ef-4958-8b56-4f6f590fc7da-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.699600 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" 
(UniqueName: \"kubernetes.io/configmap/7d54862f-97ef-4958-8b56-4f6f590fc7da-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.699635 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slrvl\" (UniqueName: \"kubernetes.io/projected/7d54862f-97ef-4958-8b56-4f6f590fc7da-kube-api-access-slrvl\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.699667 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.699762 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.699842 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " 
pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.699881 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7d54862f-97ef-4958-8b56-4f6f590fc7da-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.699901 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.701024 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7d54862f-97ef-4958-8b56-4f6f590fc7da-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.706018 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7d54862f-97ef-4958-8b56-4f6f590fc7da-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.707948 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-config\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 
13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.707967 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.708048 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0" Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.708156 4867 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.708186 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0"
Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.708194 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8b1bb68adf8576a50f9d1afe1558762f141c90adcfe42ae323643ac07b58f5a8/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.713031 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0"
Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.714414 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7d54862f-97ef-4958-8b56-4f6f590fc7da-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0"
Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.716862 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7d54862f-97ef-4958-8b56-4f6f590fc7da-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0"
Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.717368 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slrvl\" (UniqueName: \"kubernetes.io/projected/7d54862f-97ef-4958-8b56-4f6f590fc7da-kube-api-access-slrvl\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0"
Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.763807 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8873020c-5283-4bb6-8b82-65abe39a7cf4\") pod \"prometheus-metric-storage-0\" (UID: \"7d54862f-97ef-4958-8b56-4f6f590fc7da\") " pod="openstack/prometheus-metric-storage-0"
Oct 06 13:51:44 crc kubenswrapper[4867]: I1006 13:51:44.812573 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 06 13:51:45 crc kubenswrapper[4867]: I1006 13:51:45.235263 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="142692d3-42d3-469a-ab1e-e24752dd0b11" path="/var/lib/kubelet/pods/142692d3-42d3-469a-ab1e-e24752dd0b11/volumes"
Oct 06 13:51:45 crc kubenswrapper[4867]: I1006 13:51:45.325821 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 06 13:51:46 crc kubenswrapper[4867]: I1006 13:51:46.360504 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7d54862f-97ef-4958-8b56-4f6f590fc7da","Type":"ContainerStarted","Data":"c9f3a5e1df33c91867087e71cd759ce6439a186d4b72cc88f1c57b149e876943"}
Oct 06 13:51:49 crc kubenswrapper[4867]: I1006 13:51:49.389068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7d54862f-97ef-4958-8b56-4f6f590fc7da","Type":"ContainerStarted","Data":"814cb145873104fd3fd52d3ec34191d9bbf961455ee9f1731c46d5d9c1218ab2"}
Oct 06 13:51:50 crc kubenswrapper[4867]: E1006 13:51:50.509581 4867 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.198:33752->38.102.83.198:45409: write tcp 38.102.83.198:33752->38.102.83.198:45409: write: broken pipe
Oct 06 13:51:57 crc kubenswrapper[4867]: I1006 13:51:57.490542 4867 generic.go:334] "Generic (PLEG): container finished" podID="7d54862f-97ef-4958-8b56-4f6f590fc7da" containerID="814cb145873104fd3fd52d3ec34191d9bbf961455ee9f1731c46d5d9c1218ab2" exitCode=0
Oct 06 13:51:57 crc kubenswrapper[4867]: I1006 13:51:57.490688 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7d54862f-97ef-4958-8b56-4f6f590fc7da","Type":"ContainerDied","Data":"814cb145873104fd3fd52d3ec34191d9bbf961455ee9f1731c46d5d9c1218ab2"}
Oct 06 13:51:58 crc kubenswrapper[4867]: I1006 13:51:58.530431 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7d54862f-97ef-4958-8b56-4f6f590fc7da","Type":"ContainerStarted","Data":"3e35bd067fdca19e0c1cb3b8e4ee5856ad198b8b1e84a9a38478a2189a15b3bf"}
Oct 06 13:52:02 crc kubenswrapper[4867]: I1006 13:52:02.566557 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7d54862f-97ef-4958-8b56-4f6f590fc7da","Type":"ContainerStarted","Data":"9bb9c35137cb42198dac270fa247be9c919702b6c819f43085ea3c54956c2948"}
Oct 06 13:52:02 crc kubenswrapper[4867]: I1006 13:52:02.567105 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7d54862f-97ef-4958-8b56-4f6f590fc7da","Type":"ContainerStarted","Data":"95c882467430455bd3ccbb93ce1aa70db0e57846898317b8686fc7b9871654ae"}
Oct 06 13:52:02 crc kubenswrapper[4867]: I1006 13:52:02.607218 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.607153076 podStartE2EDuration="18.607153076s" podCreationTimestamp="2025-10-06 13:51:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:52:02.597432892 +0000 UTC m=+2902.055381036" watchObservedRunningTime="2025-10-06 13:52:02.607153076 +0000 UTC m=+2902.065101220"
Oct 06 13:52:02 crc kubenswrapper[4867]: I1006 13:52:02.880908 4867 scope.go:117] "RemoveContainer" containerID="75d0acf68d2e15dcf7d71ede8da663ced249c84d0fce069f660f6cbdd38884f6"
Oct 06 13:52:02 crc kubenswrapper[4867]: I1006 13:52:02.910284 4867 scope.go:117] "RemoveContainer" containerID="064974d3fb6315a201c9f6a24226589e556c74f259c14d0eb8d3c898db9995d7"
Oct 06 13:52:02 crc kubenswrapper[4867]: I1006 13:52:02.982870 4867 scope.go:117] "RemoveContainer" containerID="b53990b0029bf080c6eba2b5ad95272a836e4fb7e74db4d7ead2f1f083d0de16"
Oct 06 13:52:03 crc kubenswrapper[4867]: I1006 13:52:03.008885 4867 scope.go:117] "RemoveContainer" containerID="9f31e63f5b82c4a8df364cafb166381b10241cc8cbe912fe523b493ae6e21951"
Oct 06 13:52:04 crc kubenswrapper[4867]: I1006 13:52:04.813565 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Oct 06 13:52:12 crc kubenswrapper[4867]: I1006 13:52:12.873545 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 13:52:12 crc kubenswrapper[4867]: I1006 13:52:12.874503 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 13:52:12 crc kubenswrapper[4867]: I1006 13:52:12.874619 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq"
Oct 06 13:52:12 crc kubenswrapper[4867]: I1006 13:52:12.876361 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a57890df50f4067aad7fbbb27b3b2f13fa6a4de238cef56d72bff88beae742d"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 13:52:12 crc kubenswrapper[4867]: I1006 13:52:12.876446 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://6a57890df50f4067aad7fbbb27b3b2f13fa6a4de238cef56d72bff88beae742d" gracePeriod=600
Oct 06 13:52:13 crc kubenswrapper[4867]: I1006 13:52:13.692217 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="6a57890df50f4067aad7fbbb27b3b2f13fa6a4de238cef56d72bff88beae742d" exitCode=0
Oct 06 13:52:13 crc kubenswrapper[4867]: I1006 13:52:13.692428 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"6a57890df50f4067aad7fbbb27b3b2f13fa6a4de238cef56d72bff88beae742d"}
Oct 06 13:52:13 crc kubenswrapper[4867]: I1006 13:52:13.692871 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522"}
Oct 06 13:52:13 crc kubenswrapper[4867]: I1006 13:52:13.692913 4867 scope.go:117] "RemoveContainer" containerID="48cc553e0399e11ee09ba105fb7146a7751853e1ce6fbfea3ccdd58beb81988d"
Oct 06 13:52:14 crc kubenswrapper[4867]: I1006 13:52:14.813841 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Oct 06 13:52:14 crc kubenswrapper[4867]: I1006 13:52:14.819633 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Oct 06 13:52:15 crc kubenswrapper[4867]: I1006 13:52:15.726521 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Oct 06 13:52:20 crc kubenswrapper[4867]: I1006 13:52:20.924070 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 06 13:52:20 crc kubenswrapper[4867]: I1006 13:52:20.926361 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Oct 06 13:52:20 crc kubenswrapper[4867]: I1006 13:52:20.927952 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n7mnk"
Oct 06 13:52:20 crc kubenswrapper[4867]: I1006 13:52:20.929846 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Oct 06 13:52:20 crc kubenswrapper[4867]: I1006 13:52:20.930049 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Oct 06 13:52:20 crc kubenswrapper[4867]: I1006 13:52:20.930541 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Oct 06 13:52:20 crc kubenswrapper[4867]: I1006 13:52:20.942282 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.027493 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.027931 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.029830 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-config-data\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.030120 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.030308 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.030401 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.030539 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.030660 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g778f\" (UniqueName: \"kubernetes.io/projected/43684055-87e6-4568-8a80-8019600aaeef-kube-api-access-g778f\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.030793 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.132702 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.132768 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.132832 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.132878 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g778f\" (UniqueName: \"kubernetes.io/projected/43684055-87e6-4568-8a80-8019600aaeef-kube-api-access-g778f\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.132933 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.133008 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.133035 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.133078 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-config-data\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.133146 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.134286 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.134942 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.135672 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.136067 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.136368 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-config-data\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.143959 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.151233 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.154806 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.168701 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g778f\" (UniqueName: \"kubernetes.io/projected/43684055-87e6-4568-8a80-8019600aaeef-kube-api-access-g778f\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.203875 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.259445 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.700715 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Oct 06 13:52:21 crc kubenswrapper[4867]: I1006 13:52:21.813167 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"43684055-87e6-4568-8a80-8019600aaeef","Type":"ContainerStarted","Data":"64137c40f37792976ecc2072fe48b50ea71728e06246f4964cb79f261125f077"}
Oct 06 13:52:33 crc kubenswrapper[4867]: I1006 13:52:33.952611 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"43684055-87e6-4568-8a80-8019600aaeef","Type":"ContainerStarted","Data":"80cda36f8946522fe0e1cac3986cc232aee88e99304fb948bc0d9a610cc4f637"}
Oct 06 13:52:33 crc kubenswrapper[4867]: I1006 13:52:33.969220 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.9990972449999997 podStartE2EDuration="14.969195061s" podCreationTimestamp="2025-10-06 13:52:19 +0000 UTC" firstStartedPulling="2025-10-06 13:52:21.712676004 +0000 UTC m=+2921.170624158" lastFinishedPulling="2025-10-06 13:52:32.68277383 +0000 UTC m=+2932.140721974" observedRunningTime="2025-10-06 13:52:33.966714884 +0000 UTC m=+2933.424663028" watchObservedRunningTime="2025-10-06 13:52:33.969195061 +0000 UTC m=+2933.427143215"
Oct 06 13:54:19 crc kubenswrapper[4867]: I1006 13:54:19.863783 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dgqrd"]
Oct 06 13:54:19 crc kubenswrapper[4867]: I1006 13:54:19.867456 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:19 crc kubenswrapper[4867]: I1006 13:54:19.884114 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgqrd"]
Oct 06 13:54:19 crc kubenswrapper[4867]: I1006 13:54:19.972881 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-utilities\") pod \"redhat-marketplace-dgqrd\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") " pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:19 crc kubenswrapper[4867]: I1006 13:54:19.973534 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-catalog-content\") pod \"redhat-marketplace-dgqrd\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") " pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:19 crc kubenswrapper[4867]: I1006 13:54:19.973994 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6nbg\" (UniqueName: \"kubernetes.io/projected/f669b2d6-bf4b-4fae-abc1-d2193718db61-kube-api-access-q6nbg\") pod \"redhat-marketplace-dgqrd\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") " pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:20 crc kubenswrapper[4867]: I1006 13:54:20.077892 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-catalog-content\") pod \"redhat-marketplace-dgqrd\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") " pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:20 crc kubenswrapper[4867]: I1006 13:54:20.078264 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6nbg\" (UniqueName: \"kubernetes.io/projected/f669b2d6-bf4b-4fae-abc1-d2193718db61-kube-api-access-q6nbg\") pod \"redhat-marketplace-dgqrd\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") " pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:20 crc kubenswrapper[4867]: I1006 13:54:20.078354 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-utilities\") pod \"redhat-marketplace-dgqrd\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") " pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:20 crc kubenswrapper[4867]: I1006 13:54:20.078815 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-catalog-content\") pod \"redhat-marketplace-dgqrd\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") " pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:20 crc kubenswrapper[4867]: I1006 13:54:20.079131 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-utilities\") pod \"redhat-marketplace-dgqrd\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") " pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:20 crc kubenswrapper[4867]: I1006 13:54:20.103395 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6nbg\" (UniqueName: \"kubernetes.io/projected/f669b2d6-bf4b-4fae-abc1-d2193718db61-kube-api-access-q6nbg\") pod \"redhat-marketplace-dgqrd\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") " pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:20 crc kubenswrapper[4867]: I1006 13:54:20.206007 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:20 crc kubenswrapper[4867]: I1006 13:54:20.751379 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgqrd"]
Oct 06 13:54:21 crc kubenswrapper[4867]: I1006 13:54:21.177237 4867 generic.go:334] "Generic (PLEG): container finished" podID="f669b2d6-bf4b-4fae-abc1-d2193718db61" containerID="1db3fec5e2906aaf901703ea646cc942f2ad33f8e77369bdc96c44e253acb9e7" exitCode=0
Oct 06 13:54:21 crc kubenswrapper[4867]: I1006 13:54:21.177829 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgqrd" event={"ID":"f669b2d6-bf4b-4fae-abc1-d2193718db61","Type":"ContainerDied","Data":"1db3fec5e2906aaf901703ea646cc942f2ad33f8e77369bdc96c44e253acb9e7"}
Oct 06 13:54:21 crc kubenswrapper[4867]: I1006 13:54:21.177881 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgqrd" event={"ID":"f669b2d6-bf4b-4fae-abc1-d2193718db61","Type":"ContainerStarted","Data":"cc3c02a0b8e870bb45b245e7d2832b707ec929ecaa1f3385e96460c45e3ba2a7"}
Oct 06 13:54:23 crc kubenswrapper[4867]: I1006 13:54:23.197385 4867 generic.go:334] "Generic (PLEG): container finished" podID="f669b2d6-bf4b-4fae-abc1-d2193718db61" containerID="c7a85b9e074ec6f5b61bb190c19a05e2716ec6e1e2fe1fc34cc4b3d5bc53ed2e" exitCode=0
Oct 06 13:54:23 crc kubenswrapper[4867]: I1006 13:54:23.197472 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgqrd" event={"ID":"f669b2d6-bf4b-4fae-abc1-d2193718db61","Type":"ContainerDied","Data":"c7a85b9e074ec6f5b61bb190c19a05e2716ec6e1e2fe1fc34cc4b3d5bc53ed2e"}
Oct 06 13:54:23 crc kubenswrapper[4867]: I1006 13:54:23.202486 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 06 13:54:24 crc kubenswrapper[4867]: I1006 13:54:24.215507 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgqrd" event={"ID":"f669b2d6-bf4b-4fae-abc1-d2193718db61","Type":"ContainerStarted","Data":"7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e"}
Oct 06 13:54:24 crc kubenswrapper[4867]: I1006 13:54:24.252594 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dgqrd" podStartSLOduration=2.801871025 podStartE2EDuration="5.25255813s" podCreationTimestamp="2025-10-06 13:54:19 +0000 UTC" firstStartedPulling="2025-10-06 13:54:21.18039817 +0000 UTC m=+3040.638346324" lastFinishedPulling="2025-10-06 13:54:23.631085245 +0000 UTC m=+3043.089033429" observedRunningTime="2025-10-06 13:54:24.23964641 +0000 UTC m=+3043.697594544" watchObservedRunningTime="2025-10-06 13:54:24.25255813 +0000 UTC m=+3043.710506274"
Oct 06 13:54:30 crc kubenswrapper[4867]: I1006 13:54:30.207552 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:30 crc kubenswrapper[4867]: I1006 13:54:30.208083 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:30 crc kubenswrapper[4867]: I1006 13:54:30.264601 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:30 crc kubenswrapper[4867]: I1006 13:54:30.346069 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:30 crc kubenswrapper[4867]: I1006 13:54:30.507587 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgqrd"]
Oct 06 13:54:32 crc kubenswrapper[4867]: I1006 13:54:32.307216 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dgqrd" podUID="f669b2d6-bf4b-4fae-abc1-d2193718db61" containerName="registry-server" containerID="cri-o://7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e" gracePeriod=2
Oct 06 13:54:32 crc kubenswrapper[4867]: I1006 13:54:32.896213 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.068220 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6nbg\" (UniqueName: \"kubernetes.io/projected/f669b2d6-bf4b-4fae-abc1-d2193718db61-kube-api-access-q6nbg\") pod \"f669b2d6-bf4b-4fae-abc1-d2193718db61\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") "
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.068432 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-utilities\") pod \"f669b2d6-bf4b-4fae-abc1-d2193718db61\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") "
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.068754 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-catalog-content\") pod \"f669b2d6-bf4b-4fae-abc1-d2193718db61\" (UID: \"f669b2d6-bf4b-4fae-abc1-d2193718db61\") "
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.069814 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-utilities" (OuterVolumeSpecName: "utilities") pod "f669b2d6-bf4b-4fae-abc1-d2193718db61" (UID: "f669b2d6-bf4b-4fae-abc1-d2193718db61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.087752 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f669b2d6-bf4b-4fae-abc1-d2193718db61" (UID: "f669b2d6-bf4b-4fae-abc1-d2193718db61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.088272 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f669b2d6-bf4b-4fae-abc1-d2193718db61-kube-api-access-q6nbg" (OuterVolumeSpecName: "kube-api-access-q6nbg") pod "f669b2d6-bf4b-4fae-abc1-d2193718db61" (UID: "f669b2d6-bf4b-4fae-abc1-d2193718db61"). InnerVolumeSpecName "kube-api-access-q6nbg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.171477 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.172592 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6nbg\" (UniqueName: \"kubernetes.io/projected/f669b2d6-bf4b-4fae-abc1-d2193718db61-kube-api-access-q6nbg\") on node \"crc\" DevicePath \"\""
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.172631 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f669b2d6-bf4b-4fae-abc1-d2193718db61-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.320992 4867 generic.go:334] "Generic (PLEG): container finished" podID="f669b2d6-bf4b-4fae-abc1-d2193718db61" containerID="7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e" exitCode=0
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.321087 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgqrd"
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.321088 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgqrd" event={"ID":"f669b2d6-bf4b-4fae-abc1-d2193718db61","Type":"ContainerDied","Data":"7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e"}
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.321142 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgqrd" event={"ID":"f669b2d6-bf4b-4fae-abc1-d2193718db61","Type":"ContainerDied","Data":"cc3c02a0b8e870bb45b245e7d2832b707ec929ecaa1f3385e96460c45e3ba2a7"}
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.321169 4867 scope.go:117] "RemoveContainer" containerID="7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e"
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.353500 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgqrd"]
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.354736 4867 scope.go:117] "RemoveContainer" containerID="c7a85b9e074ec6f5b61bb190c19a05e2716ec6e1e2fe1fc34cc4b3d5bc53ed2e"
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.361336 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgqrd"]
Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.388490 4867 scope.go:117] "RemoveContainer" containerID="1db3fec5e2906aaf901703ea646cc942f2ad33f8e77369bdc96c44e253acb9e7"
Oct 06 13:54:33 crc kubenswrapper[4867]: E1006 13:54:33.400906 4867 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf669b2d6_bf4b_4fae_abc1_d2193718db61.slice/crio-cc3c02a0b8e870bb45b245e7d2832b707ec929ecaa1f3385e96460c45e3ba2a7\": RecentStats: unable to find data in memory cache]" Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.428398 4867 scope.go:117] "RemoveContainer" containerID="7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e" Oct 06 13:54:33 crc kubenswrapper[4867]: E1006 13:54:33.428808 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e\": container with ID starting with 7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e not found: ID does not exist" containerID="7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e" Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.428855 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e"} err="failed to get container status \"7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e\": rpc error: code = NotFound desc = could not find container \"7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e\": container with ID starting with 7b570c9e4ec8788e80d8fe314e0ed73ae24b1fa0993213cf27a2b18c4459be5e not found: ID does not exist" Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.428884 4867 scope.go:117] "RemoveContainer" containerID="c7a85b9e074ec6f5b61bb190c19a05e2716ec6e1e2fe1fc34cc4b3d5bc53ed2e" Oct 06 13:54:33 crc kubenswrapper[4867]: E1006 13:54:33.429538 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7a85b9e074ec6f5b61bb190c19a05e2716ec6e1e2fe1fc34cc4b3d5bc53ed2e\": container with ID starting with 
c7a85b9e074ec6f5b61bb190c19a05e2716ec6e1e2fe1fc34cc4b3d5bc53ed2e not found: ID does not exist" containerID="c7a85b9e074ec6f5b61bb190c19a05e2716ec6e1e2fe1fc34cc4b3d5bc53ed2e" Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.429597 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7a85b9e074ec6f5b61bb190c19a05e2716ec6e1e2fe1fc34cc4b3d5bc53ed2e"} err="failed to get container status \"c7a85b9e074ec6f5b61bb190c19a05e2716ec6e1e2fe1fc34cc4b3d5bc53ed2e\": rpc error: code = NotFound desc = could not find container \"c7a85b9e074ec6f5b61bb190c19a05e2716ec6e1e2fe1fc34cc4b3d5bc53ed2e\": container with ID starting with c7a85b9e074ec6f5b61bb190c19a05e2716ec6e1e2fe1fc34cc4b3d5bc53ed2e not found: ID does not exist" Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.429639 4867 scope.go:117] "RemoveContainer" containerID="1db3fec5e2906aaf901703ea646cc942f2ad33f8e77369bdc96c44e253acb9e7" Oct 06 13:54:33 crc kubenswrapper[4867]: E1006 13:54:33.430063 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1db3fec5e2906aaf901703ea646cc942f2ad33f8e77369bdc96c44e253acb9e7\": container with ID starting with 1db3fec5e2906aaf901703ea646cc942f2ad33f8e77369bdc96c44e253acb9e7 not found: ID does not exist" containerID="1db3fec5e2906aaf901703ea646cc942f2ad33f8e77369bdc96c44e253acb9e7" Oct 06 13:54:33 crc kubenswrapper[4867]: I1006 13:54:33.430098 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db3fec5e2906aaf901703ea646cc942f2ad33f8e77369bdc96c44e253acb9e7"} err="failed to get container status \"1db3fec5e2906aaf901703ea646cc942f2ad33f8e77369bdc96c44e253acb9e7\": rpc error: code = NotFound desc = could not find container \"1db3fec5e2906aaf901703ea646cc942f2ad33f8e77369bdc96c44e253acb9e7\": container with ID starting with 1db3fec5e2906aaf901703ea646cc942f2ad33f8e77369bdc96c44e253acb9e7 not found: ID does not 
exist" Oct 06 13:54:35 crc kubenswrapper[4867]: I1006 13:54:35.240505 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f669b2d6-bf4b-4fae-abc1-d2193718db61" path="/var/lib/kubelet/pods/f669b2d6-bf4b-4fae-abc1-d2193718db61/volumes" Oct 06 13:54:42 crc kubenswrapper[4867]: I1006 13:54:42.873201 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:54:42 crc kubenswrapper[4867]: I1006 13:54:42.874339 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:55:12 crc kubenswrapper[4867]: I1006 13:55:12.873932 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:55:12 crc kubenswrapper[4867]: I1006 13:55:12.874739 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:55:42 crc kubenswrapper[4867]: I1006 13:55:42.874396 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:55:42 crc kubenswrapper[4867]: I1006 13:55:42.875404 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:55:42 crc kubenswrapper[4867]: I1006 13:55:42.875492 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 13:55:42 crc kubenswrapper[4867]: I1006 13:55:42.877083 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:55:42 crc kubenswrapper[4867]: I1006 13:55:42.877170 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" gracePeriod=600 Oct 06 13:55:43 crc kubenswrapper[4867]: E1006 13:55:43.007062 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" 
podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:55:43 crc kubenswrapper[4867]: I1006 13:55:43.147674 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" exitCode=0 Oct 06 13:55:43 crc kubenswrapper[4867]: I1006 13:55:43.147729 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522"} Oct 06 13:55:43 crc kubenswrapper[4867]: I1006 13:55:43.147765 4867 scope.go:117] "RemoveContainer" containerID="6a57890df50f4067aad7fbbb27b3b2f13fa6a4de238cef56d72bff88beae742d" Oct 06 13:55:43 crc kubenswrapper[4867]: I1006 13:55:43.149139 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:55:43 crc kubenswrapper[4867]: E1006 13:55:43.149767 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:55:57 crc kubenswrapper[4867]: I1006 13:55:57.221372 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:55:57 crc kubenswrapper[4867]: E1006 13:55:57.222522 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:56:10 crc kubenswrapper[4867]: I1006 13:56:10.222114 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:56:10 crc kubenswrapper[4867]: E1006 13:56:10.223181 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:56:23 crc kubenswrapper[4867]: I1006 13:56:23.221979 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:56:23 crc kubenswrapper[4867]: E1006 13:56:23.223511 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:56:37 crc kubenswrapper[4867]: I1006 13:56:37.221846 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:56:37 crc kubenswrapper[4867]: E1006 13:56:37.222952 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:56:49 crc kubenswrapper[4867]: I1006 13:56:49.221766 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:56:49 crc kubenswrapper[4867]: E1006 13:56:49.223376 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:57:03 crc kubenswrapper[4867]: I1006 13:57:03.222493 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:57:03 crc kubenswrapper[4867]: E1006 13:57:03.223489 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:57:17 crc kubenswrapper[4867]: I1006 13:57:17.222231 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:57:17 crc kubenswrapper[4867]: E1006 13:57:17.223845 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:57:28 crc kubenswrapper[4867]: I1006 13:57:28.221633 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:57:28 crc kubenswrapper[4867]: E1006 13:57:28.223670 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:57:41 crc kubenswrapper[4867]: I1006 13:57:41.228988 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:57:41 crc kubenswrapper[4867]: E1006 13:57:41.230014 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:57:56 crc kubenswrapper[4867]: I1006 13:57:56.221059 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:57:56 crc kubenswrapper[4867]: E1006 13:57:56.221802 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:58:08 crc kubenswrapper[4867]: I1006 13:58:08.220873 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:58:08 crc kubenswrapper[4867]: E1006 13:58:08.221528 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:58:22 crc kubenswrapper[4867]: I1006 13:58:22.221978 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:58:22 crc kubenswrapper[4867]: E1006 13:58:22.222975 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:58:34 crc kubenswrapper[4867]: I1006 13:58:34.221601 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:58:34 crc kubenswrapper[4867]: E1006 13:58:34.222653 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.060638 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8pfv7"] Oct 06 13:58:43 crc kubenswrapper[4867]: E1006 13:58:43.061868 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f669b2d6-bf4b-4fae-abc1-d2193718db61" containerName="registry-server" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.061885 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f669b2d6-bf4b-4fae-abc1-d2193718db61" containerName="registry-server" Oct 06 13:58:43 crc kubenswrapper[4867]: E1006 13:58:43.061910 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f669b2d6-bf4b-4fae-abc1-d2193718db61" containerName="extract-content" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.061916 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f669b2d6-bf4b-4fae-abc1-d2193718db61" containerName="extract-content" Oct 06 13:58:43 crc kubenswrapper[4867]: E1006 13:58:43.061927 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f669b2d6-bf4b-4fae-abc1-d2193718db61" containerName="extract-utilities" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.061935 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f669b2d6-bf4b-4fae-abc1-d2193718db61" containerName="extract-utilities" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.062134 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f669b2d6-bf4b-4fae-abc1-d2193718db61" containerName="registry-server" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.064122 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.076994 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pfv7"] Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.170126 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-utilities\") pod \"certified-operators-8pfv7\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.170232 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-catalog-content\") pod \"certified-operators-8pfv7\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.170332 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptz92\" (UniqueName: \"kubernetes.io/projected/168e0db1-ce4a-41f6-be38-e10512b8337c-kube-api-access-ptz92\") pod \"certified-operators-8pfv7\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.272938 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-utilities\") pod \"certified-operators-8pfv7\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.274006 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-catalog-content\") pod \"certified-operators-8pfv7\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.274158 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptz92\" (UniqueName: \"kubernetes.io/projected/168e0db1-ce4a-41f6-be38-e10512b8337c-kube-api-access-ptz92\") pod \"certified-operators-8pfv7\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.273815 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-utilities\") pod \"certified-operators-8pfv7\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.274505 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-catalog-content\") pod \"certified-operators-8pfv7\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.311464 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptz92\" (UniqueName: \"kubernetes.io/projected/168e0db1-ce4a-41f6-be38-e10512b8337c-kube-api-access-ptz92\") pod \"certified-operators-8pfv7\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.389035 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:43 crc kubenswrapper[4867]: I1006 13:58:43.970168 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pfv7"] Oct 06 13:58:44 crc kubenswrapper[4867]: I1006 13:58:44.206515 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pfv7" event={"ID":"168e0db1-ce4a-41f6-be38-e10512b8337c","Type":"ContainerStarted","Data":"80e0f294357db7ce7bfab2ecd14057a73c392d860398881e778042b9f21138cd"} Oct 06 13:58:45 crc kubenswrapper[4867]: I1006 13:58:45.219310 4867 generic.go:334] "Generic (PLEG): container finished" podID="168e0db1-ce4a-41f6-be38-e10512b8337c" containerID="4ae6120f4a4111bb300ce2a58e2b8626c65c3158244ff57720a1e3a72c240be6" exitCode=0 Oct 06 13:58:45 crc kubenswrapper[4867]: I1006 13:58:45.219397 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pfv7" event={"ID":"168e0db1-ce4a-41f6-be38-e10512b8337c","Type":"ContainerDied","Data":"4ae6120f4a4111bb300ce2a58e2b8626c65c3158244ff57720a1e3a72c240be6"} Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.450969 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tg2p2"] Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.454804 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.465920 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tg2p2"] Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.601414 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-catalog-content\") pod \"redhat-operators-tg2p2\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.601703 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-utilities\") pod \"redhat-operators-tg2p2\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.601813 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv2br\" (UniqueName: \"kubernetes.io/projected/277689bf-694b-4880-a3b1-de3ff45480b5-kube-api-access-zv2br\") pod \"redhat-operators-tg2p2\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.705002 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-catalog-content\") pod \"redhat-operators-tg2p2\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.705581 4867 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-utilities\") pod \"redhat-operators-tg2p2\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.705700 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv2br\" (UniqueName: \"kubernetes.io/projected/277689bf-694b-4880-a3b1-de3ff45480b5-kube-api-access-zv2br\") pod \"redhat-operators-tg2p2\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.705729 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-catalog-content\") pod \"redhat-operators-tg2p2\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.706077 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-utilities\") pod \"redhat-operators-tg2p2\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.736314 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv2br\" (UniqueName: \"kubernetes.io/projected/277689bf-694b-4880-a3b1-de3ff45480b5-kube-api-access-zv2br\") pod \"redhat-operators-tg2p2\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:47 crc kubenswrapper[4867]: I1006 13:58:47.789332 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:48 crc kubenswrapper[4867]: I1006 13:58:48.262992 4867 generic.go:334] "Generic (PLEG): container finished" podID="168e0db1-ce4a-41f6-be38-e10512b8337c" containerID="1d2080dd83f456d8315e0cfa0d7b8251def4574e1942897ade1dc118bf7ecd32" exitCode=0 Oct 06 13:58:48 crc kubenswrapper[4867]: I1006 13:58:48.263147 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pfv7" event={"ID":"168e0db1-ce4a-41f6-be38-e10512b8337c","Type":"ContainerDied","Data":"1d2080dd83f456d8315e0cfa0d7b8251def4574e1942897ade1dc118bf7ecd32"} Oct 06 13:58:48 crc kubenswrapper[4867]: I1006 13:58:48.328271 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tg2p2"] Oct 06 13:58:49 crc kubenswrapper[4867]: I1006 13:58:49.221847 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:58:49 crc kubenswrapper[4867]: E1006 13:58:49.222677 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:58:49 crc kubenswrapper[4867]: I1006 13:58:49.277624 4867 generic.go:334] "Generic (PLEG): container finished" podID="277689bf-694b-4880-a3b1-de3ff45480b5" containerID="46dd4d69a342ba5d90c2347ffea90e40feed026eae042747421d5a0cda65d04e" exitCode=0 Oct 06 13:58:49 crc kubenswrapper[4867]: I1006 13:58:49.277678 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p2" 
event={"ID":"277689bf-694b-4880-a3b1-de3ff45480b5","Type":"ContainerDied","Data":"46dd4d69a342ba5d90c2347ffea90e40feed026eae042747421d5a0cda65d04e"} Oct 06 13:58:49 crc kubenswrapper[4867]: I1006 13:58:49.277796 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p2" event={"ID":"277689bf-694b-4880-a3b1-de3ff45480b5","Type":"ContainerStarted","Data":"41bf6c8b4e39bf006044a9c551b3293347e7996c309da8898f5d4490a594a7d1"} Oct 06 13:58:50 crc kubenswrapper[4867]: I1006 13:58:50.288656 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pfv7" event={"ID":"168e0db1-ce4a-41f6-be38-e10512b8337c","Type":"ContainerStarted","Data":"db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04"} Oct 06 13:58:50 crc kubenswrapper[4867]: I1006 13:58:50.324977 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8pfv7" podStartSLOduration=3.517232249 podStartE2EDuration="7.324956432s" podCreationTimestamp="2025-10-06 13:58:43 +0000 UTC" firstStartedPulling="2025-10-06 13:58:45.22314739 +0000 UTC m=+3304.681095534" lastFinishedPulling="2025-10-06 13:58:49.030871573 +0000 UTC m=+3308.488819717" observedRunningTime="2025-10-06 13:58:50.320157732 +0000 UTC m=+3309.778105876" watchObservedRunningTime="2025-10-06 13:58:50.324956432 +0000 UTC m=+3309.782904576" Oct 06 13:58:51 crc kubenswrapper[4867]: I1006 13:58:51.305488 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p2" event={"ID":"277689bf-694b-4880-a3b1-de3ff45480b5","Type":"ContainerStarted","Data":"e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8"} Oct 06 13:58:52 crc kubenswrapper[4867]: I1006 13:58:52.318847 4867 generic.go:334] "Generic (PLEG): container finished" podID="277689bf-694b-4880-a3b1-de3ff45480b5" containerID="e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8" 
exitCode=0 Oct 06 13:58:52 crc kubenswrapper[4867]: I1006 13:58:52.318925 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p2" event={"ID":"277689bf-694b-4880-a3b1-de3ff45480b5","Type":"ContainerDied","Data":"e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8"} Oct 06 13:58:53 crc kubenswrapper[4867]: I1006 13:58:53.389995 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:53 crc kubenswrapper[4867]: I1006 13:58:53.390554 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:53 crc kubenswrapper[4867]: I1006 13:58:53.464430 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:54 crc kubenswrapper[4867]: I1006 13:58:54.349171 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p2" event={"ID":"277689bf-694b-4880-a3b1-de3ff45480b5","Type":"ContainerStarted","Data":"447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9"} Oct 06 13:58:54 crc kubenswrapper[4867]: I1006 13:58:54.381920 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tg2p2" podStartSLOduration=3.433596642 podStartE2EDuration="7.381894572s" podCreationTimestamp="2025-10-06 13:58:47 +0000 UTC" firstStartedPulling="2025-10-06 13:58:49.27971614 +0000 UTC m=+3308.737664284" lastFinishedPulling="2025-10-06 13:58:53.22801407 +0000 UTC m=+3312.685962214" observedRunningTime="2025-10-06 13:58:54.37152671 +0000 UTC m=+3313.829474934" watchObservedRunningTime="2025-10-06 13:58:54.381894572 +0000 UTC m=+3313.839842716" Oct 06 13:58:54 crc kubenswrapper[4867]: I1006 13:58:54.447104 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:55 crc kubenswrapper[4867]: I1006 13:58:55.651724 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pfv7"] Oct 06 13:58:56 crc kubenswrapper[4867]: I1006 13:58:56.381093 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8pfv7" podUID="168e0db1-ce4a-41f6-be38-e10512b8337c" containerName="registry-server" containerID="cri-o://db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04" gracePeriod=2 Oct 06 13:58:56 crc kubenswrapper[4867]: I1006 13:58:56.906202 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.040668 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-utilities\") pod \"168e0db1-ce4a-41f6-be38-e10512b8337c\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.040797 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-catalog-content\") pod \"168e0db1-ce4a-41f6-be38-e10512b8337c\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.040900 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptz92\" (UniqueName: \"kubernetes.io/projected/168e0db1-ce4a-41f6-be38-e10512b8337c-kube-api-access-ptz92\") pod \"168e0db1-ce4a-41f6-be38-e10512b8337c\" (UID: \"168e0db1-ce4a-41f6-be38-e10512b8337c\") " Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.041508 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-utilities" (OuterVolumeSpecName: "utilities") pod "168e0db1-ce4a-41f6-be38-e10512b8337c" (UID: "168e0db1-ce4a-41f6-be38-e10512b8337c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.056873 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168e0db1-ce4a-41f6-be38-e10512b8337c-kube-api-access-ptz92" (OuterVolumeSpecName: "kube-api-access-ptz92") pod "168e0db1-ce4a-41f6-be38-e10512b8337c" (UID: "168e0db1-ce4a-41f6-be38-e10512b8337c"). InnerVolumeSpecName "kube-api-access-ptz92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.089463 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "168e0db1-ce4a-41f6-be38-e10512b8337c" (UID: "168e0db1-ce4a-41f6-be38-e10512b8337c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.144013 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.144070 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168e0db1-ce4a-41f6-be38-e10512b8337c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.144090 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptz92\" (UniqueName: \"kubernetes.io/projected/168e0db1-ce4a-41f6-be38-e10512b8337c-kube-api-access-ptz92\") on node \"crc\" DevicePath \"\"" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.398981 4867 generic.go:334] "Generic (PLEG): container finished" podID="168e0db1-ce4a-41f6-be38-e10512b8337c" containerID="db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04" exitCode=0 Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.399061 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pfv7" event={"ID":"168e0db1-ce4a-41f6-be38-e10512b8337c","Type":"ContainerDied","Data":"db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04"} Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.399868 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pfv7" event={"ID":"168e0db1-ce4a-41f6-be38-e10512b8337c","Type":"ContainerDied","Data":"80e0f294357db7ce7bfab2ecd14057a73c392d860398881e778042b9f21138cd"} Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.399105 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8pfv7" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.399912 4867 scope.go:117] "RemoveContainer" containerID="db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.432999 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pfv7"] Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.437537 4867 scope.go:117] "RemoveContainer" containerID="1d2080dd83f456d8315e0cfa0d7b8251def4574e1942897ade1dc118bf7ecd32" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.442594 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8pfv7"] Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.477211 4867 scope.go:117] "RemoveContainer" containerID="4ae6120f4a4111bb300ce2a58e2b8626c65c3158244ff57720a1e3a72c240be6" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.520749 4867 scope.go:117] "RemoveContainer" containerID="db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04" Oct 06 13:58:57 crc kubenswrapper[4867]: E1006 13:58:57.521324 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04\": container with ID starting with db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04 not found: ID does not exist" containerID="db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.521372 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04"} err="failed to get container status \"db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04\": rpc error: code = NotFound desc = could not find 
container \"db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04\": container with ID starting with db8d42353d694638e820755a0ce44824695be32b0b2305b8ac2ba0b5f2eb8f04 not found: ID does not exist" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.521402 4867 scope.go:117] "RemoveContainer" containerID="1d2080dd83f456d8315e0cfa0d7b8251def4574e1942897ade1dc118bf7ecd32" Oct 06 13:58:57 crc kubenswrapper[4867]: E1006 13:58:57.521956 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d2080dd83f456d8315e0cfa0d7b8251def4574e1942897ade1dc118bf7ecd32\": container with ID starting with 1d2080dd83f456d8315e0cfa0d7b8251def4574e1942897ade1dc118bf7ecd32 not found: ID does not exist" containerID="1d2080dd83f456d8315e0cfa0d7b8251def4574e1942897ade1dc118bf7ecd32" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.521981 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d2080dd83f456d8315e0cfa0d7b8251def4574e1942897ade1dc118bf7ecd32"} err="failed to get container status \"1d2080dd83f456d8315e0cfa0d7b8251def4574e1942897ade1dc118bf7ecd32\": rpc error: code = NotFound desc = could not find container \"1d2080dd83f456d8315e0cfa0d7b8251def4574e1942897ade1dc118bf7ecd32\": container with ID starting with 1d2080dd83f456d8315e0cfa0d7b8251def4574e1942897ade1dc118bf7ecd32 not found: ID does not exist" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.521995 4867 scope.go:117] "RemoveContainer" containerID="4ae6120f4a4111bb300ce2a58e2b8626c65c3158244ff57720a1e3a72c240be6" Oct 06 13:58:57 crc kubenswrapper[4867]: E1006 13:58:57.522533 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae6120f4a4111bb300ce2a58e2b8626c65c3158244ff57720a1e3a72c240be6\": container with ID starting with 4ae6120f4a4111bb300ce2a58e2b8626c65c3158244ff57720a1e3a72c240be6 not found: ID does 
not exist" containerID="4ae6120f4a4111bb300ce2a58e2b8626c65c3158244ff57720a1e3a72c240be6" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.522594 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae6120f4a4111bb300ce2a58e2b8626c65c3158244ff57720a1e3a72c240be6"} err="failed to get container status \"4ae6120f4a4111bb300ce2a58e2b8626c65c3158244ff57720a1e3a72c240be6\": rpc error: code = NotFound desc = could not find container \"4ae6120f4a4111bb300ce2a58e2b8626c65c3158244ff57720a1e3a72c240be6\": container with ID starting with 4ae6120f4a4111bb300ce2a58e2b8626c65c3158244ff57720a1e3a72c240be6 not found: ID does not exist" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.789916 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:57 crc kubenswrapper[4867]: I1006 13:58:57.789986 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:58:58 crc kubenswrapper[4867]: I1006 13:58:58.845857 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tg2p2" podUID="277689bf-694b-4880-a3b1-de3ff45480b5" containerName="registry-server" probeResult="failure" output=< Oct 06 13:58:58 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Oct 06 13:58:58 crc kubenswrapper[4867]: > Oct 06 13:58:59 crc kubenswrapper[4867]: I1006 13:58:59.256010 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="168e0db1-ce4a-41f6-be38-e10512b8337c" path="/var/lib/kubelet/pods/168e0db1-ce4a-41f6-be38-e10512b8337c/volumes" Oct 06 13:59:03 crc kubenswrapper[4867]: I1006 13:59:03.225924 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:59:03 crc kubenswrapper[4867]: E1006 13:59:03.227429 4867 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:59:07 crc kubenswrapper[4867]: I1006 13:59:07.839650 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:59:07 crc kubenswrapper[4867]: I1006 13:59:07.909009 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:59:08 crc kubenswrapper[4867]: I1006 13:59:08.085362 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tg2p2"] Oct 06 13:59:09 crc kubenswrapper[4867]: I1006 13:59:09.527859 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tg2p2" podUID="277689bf-694b-4880-a3b1-de3ff45480b5" containerName="registry-server" containerID="cri-o://447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9" gracePeriod=2 Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.073322 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.179410 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-catalog-content\") pod \"277689bf-694b-4880-a3b1-de3ff45480b5\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.179851 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-utilities\") pod \"277689bf-694b-4880-a3b1-de3ff45480b5\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.180161 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv2br\" (UniqueName: \"kubernetes.io/projected/277689bf-694b-4880-a3b1-de3ff45480b5-kube-api-access-zv2br\") pod \"277689bf-694b-4880-a3b1-de3ff45480b5\" (UID: \"277689bf-694b-4880-a3b1-de3ff45480b5\") " Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.180661 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-utilities" (OuterVolumeSpecName: "utilities") pod "277689bf-694b-4880-a3b1-de3ff45480b5" (UID: "277689bf-694b-4880-a3b1-de3ff45480b5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.181630 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.191348 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277689bf-694b-4880-a3b1-de3ff45480b5-kube-api-access-zv2br" (OuterVolumeSpecName: "kube-api-access-zv2br") pod "277689bf-694b-4880-a3b1-de3ff45480b5" (UID: "277689bf-694b-4880-a3b1-de3ff45480b5"). InnerVolumeSpecName "kube-api-access-zv2br". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.276497 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "277689bf-694b-4880-a3b1-de3ff45480b5" (UID: "277689bf-694b-4880-a3b1-de3ff45480b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.284509 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv2br\" (UniqueName: \"kubernetes.io/projected/277689bf-694b-4880-a3b1-de3ff45480b5-kube-api-access-zv2br\") on node \"crc\" DevicePath \"\"" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.284566 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277689bf-694b-4880-a3b1-de3ff45480b5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.546528 4867 generic.go:334] "Generic (PLEG): container finished" podID="277689bf-694b-4880-a3b1-de3ff45480b5" containerID="447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9" exitCode=0 Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.546612 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p2" event={"ID":"277689bf-694b-4880-a3b1-de3ff45480b5","Type":"ContainerDied","Data":"447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9"} Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.546657 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p2" event={"ID":"277689bf-694b-4880-a3b1-de3ff45480b5","Type":"ContainerDied","Data":"41bf6c8b4e39bf006044a9c551b3293347e7996c309da8898f5d4490a594a7d1"} Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.546683 4867 scope.go:117] "RemoveContainer" containerID="447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.546686 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tg2p2" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.589324 4867 scope.go:117] "RemoveContainer" containerID="e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.590569 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tg2p2"] Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.601581 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tg2p2"] Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.623146 4867 scope.go:117] "RemoveContainer" containerID="46dd4d69a342ba5d90c2347ffea90e40feed026eae042747421d5a0cda65d04e" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.667745 4867 scope.go:117] "RemoveContainer" containerID="447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9" Oct 06 13:59:10 crc kubenswrapper[4867]: E1006 13:59:10.668501 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9\": container with ID starting with 447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9 not found: ID does not exist" containerID="447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.668567 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9"} err="failed to get container status \"447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9\": rpc error: code = NotFound desc = could not find container \"447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9\": container with ID starting with 447ccc86b13799665994bf69d6ef0211d886b52cfb594e730438f55b690114f9 not found: ID does 
not exist" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.668609 4867 scope.go:117] "RemoveContainer" containerID="e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8" Oct 06 13:59:10 crc kubenswrapper[4867]: E1006 13:59:10.669372 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8\": container with ID starting with e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8 not found: ID does not exist" containerID="e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.669419 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8"} err="failed to get container status \"e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8\": rpc error: code = NotFound desc = could not find container \"e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8\": container with ID starting with e2138e503a5ab4112f9f5bc9707d250ba92733a541278999ed68b07fd73de6e8 not found: ID does not exist" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.669451 4867 scope.go:117] "RemoveContainer" containerID="46dd4d69a342ba5d90c2347ffea90e40feed026eae042747421d5a0cda65d04e" Oct 06 13:59:10 crc kubenswrapper[4867]: E1006 13:59:10.670028 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46dd4d69a342ba5d90c2347ffea90e40feed026eae042747421d5a0cda65d04e\": container with ID starting with 46dd4d69a342ba5d90c2347ffea90e40feed026eae042747421d5a0cda65d04e not found: ID does not exist" containerID="46dd4d69a342ba5d90c2347ffea90e40feed026eae042747421d5a0cda65d04e" Oct 06 13:59:10 crc kubenswrapper[4867]: I1006 13:59:10.670098 4867 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46dd4d69a342ba5d90c2347ffea90e40feed026eae042747421d5a0cda65d04e"} err="failed to get container status \"46dd4d69a342ba5d90c2347ffea90e40feed026eae042747421d5a0cda65d04e\": rpc error: code = NotFound desc = could not find container \"46dd4d69a342ba5d90c2347ffea90e40feed026eae042747421d5a0cda65d04e\": container with ID starting with 46dd4d69a342ba5d90c2347ffea90e40feed026eae042747421d5a0cda65d04e not found: ID does not exist" Oct 06 13:59:11 crc kubenswrapper[4867]: I1006 13:59:11.239489 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="277689bf-694b-4880-a3b1-de3ff45480b5" path="/var/lib/kubelet/pods/277689bf-694b-4880-a3b1-de3ff45480b5/volumes" Oct 06 13:59:15 crc kubenswrapper[4867]: I1006 13:59:15.224278 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:59:15 crc kubenswrapper[4867]: E1006 13:59:15.225541 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 13:59:29 crc kubenswrapper[4867]: I1006 13:59:29.222905 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 13:59:29 crc kubenswrapper[4867]: E1006 13:59:29.224355 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3"
Oct 06 13:59:41 crc kubenswrapper[4867]: I1006 13:59:41.229513 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522"
Oct 06 13:59:41 crc kubenswrapper[4867]: E1006 13:59:41.230632 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3"
Oct 06 13:59:56 crc kubenswrapper[4867]: I1006 13:59:56.221326 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522"
Oct 06 13:59:56 crc kubenswrapper[4867]: E1006 13:59:56.222270 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.174159 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"]
Oct 06 14:00:00 crc kubenswrapper[4867]: E1006 14:00:00.175154 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168e0db1-ce4a-41f6-be38-e10512b8337c" containerName="extract-utilities"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.175172 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="168e0db1-ce4a-41f6-be38-e10512b8337c" containerName="extract-utilities"
Oct 06 14:00:00 crc kubenswrapper[4867]: E1006 14:00:00.175187 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277689bf-694b-4880-a3b1-de3ff45480b5" containerName="extract-content"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.175198 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="277689bf-694b-4880-a3b1-de3ff45480b5" containerName="extract-content"
Oct 06 14:00:00 crc kubenswrapper[4867]: E1006 14:00:00.175225 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277689bf-694b-4880-a3b1-de3ff45480b5" containerName="extract-utilities"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.175276 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="277689bf-694b-4880-a3b1-de3ff45480b5" containerName="extract-utilities"
Oct 06 14:00:00 crc kubenswrapper[4867]: E1006 14:00:00.175300 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168e0db1-ce4a-41f6-be38-e10512b8337c" containerName="registry-server"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.175308 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="168e0db1-ce4a-41f6-be38-e10512b8337c" containerName="registry-server"
Oct 06 14:00:00 crc kubenswrapper[4867]: E1006 14:00:00.175333 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168e0db1-ce4a-41f6-be38-e10512b8337c" containerName="extract-content"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.175341 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="168e0db1-ce4a-41f6-be38-e10512b8337c" containerName="extract-content"
Oct 06 14:00:00 crc kubenswrapper[4867]: E1006 14:00:00.175359 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277689bf-694b-4880-a3b1-de3ff45480b5" containerName="registry-server"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.175366 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="277689bf-694b-4880-a3b1-de3ff45480b5" containerName="registry-server"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.175634 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="168e0db1-ce4a-41f6-be38-e10512b8337c" containerName="registry-server"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.175651 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="277689bf-694b-4880-a3b1-de3ff45480b5" containerName="registry-server"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.176591 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.179748 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.179837 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.192208 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"]
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.300921 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-config-volume\") pod \"collect-profiles-29329320-25nxb\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.301578 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmdfv\" (UniqueName: \"kubernetes.io/projected/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-kube-api-access-fmdfv\") pod \"collect-profiles-29329320-25nxb\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.301630 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-secret-volume\") pod \"collect-profiles-29329320-25nxb\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.404010 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmdfv\" (UniqueName: \"kubernetes.io/projected/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-kube-api-access-fmdfv\") pod \"collect-profiles-29329320-25nxb\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.404150 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-secret-volume\") pod \"collect-profiles-29329320-25nxb\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.404208 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-config-volume\") pod \"collect-profiles-29329320-25nxb\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.405673 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-config-volume\") pod \"collect-profiles-29329320-25nxb\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.413201 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-secret-volume\") pod \"collect-profiles-29329320-25nxb\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.423045 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmdfv\" (UniqueName: \"kubernetes.io/projected/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-kube-api-access-fmdfv\") pod \"collect-profiles-29329320-25nxb\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.504639 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:00 crc kubenswrapper[4867]: I1006 14:00:00.948619 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"]
Oct 06 14:00:01 crc kubenswrapper[4867]: I1006 14:00:01.089145 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb" event={"ID":"6ad9940f-df1b-44d5-8982-30b35e8d2d3d","Type":"ContainerStarted","Data":"0595d5bc08e710198fe0d913bce975943fe615f31d36be3b03466d7efd03b528"}
Oct 06 14:00:02 crc kubenswrapper[4867]: I1006 14:00:02.102885 4867 generic.go:334] "Generic (PLEG): container finished" podID="6ad9940f-df1b-44d5-8982-30b35e8d2d3d" containerID="b50414276136ecd6e8be8aecf45ecd681137ddaed6545e5db169d8df44e744e4" exitCode=0
Oct 06 14:00:02 crc kubenswrapper[4867]: I1006 14:00:02.102978 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb" event={"ID":"6ad9940f-df1b-44d5-8982-30b35e8d2d3d","Type":"ContainerDied","Data":"b50414276136ecd6e8be8aecf45ecd681137ddaed6545e5db169d8df44e744e4"}
Oct 06 14:00:03 crc kubenswrapper[4867]: I1006 14:00:03.487235 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:03 crc kubenswrapper[4867]: I1006 14:00:03.582459 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-config-volume\") pod \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") "
Oct 06 14:00:03 crc kubenswrapper[4867]: I1006 14:00:03.582672 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmdfv\" (UniqueName: \"kubernetes.io/projected/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-kube-api-access-fmdfv\") pod \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") "
Oct 06 14:00:03 crc kubenswrapper[4867]: I1006 14:00:03.582737 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-secret-volume\") pod \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\" (UID: \"6ad9940f-df1b-44d5-8982-30b35e8d2d3d\") "
Oct 06 14:00:03 crc kubenswrapper[4867]: I1006 14:00:03.583432 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-config-volume" (OuterVolumeSpecName: "config-volume") pod "6ad9940f-df1b-44d5-8982-30b35e8d2d3d" (UID: "6ad9940f-df1b-44d5-8982-30b35e8d2d3d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 14:00:03 crc kubenswrapper[4867]: I1006 14:00:03.584401 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 14:00:03 crc kubenswrapper[4867]: I1006 14:00:03.588565 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-kube-api-access-fmdfv" (OuterVolumeSpecName: "kube-api-access-fmdfv") pod "6ad9940f-df1b-44d5-8982-30b35e8d2d3d" (UID: "6ad9940f-df1b-44d5-8982-30b35e8d2d3d"). InnerVolumeSpecName "kube-api-access-fmdfv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 14:00:03 crc kubenswrapper[4867]: I1006 14:00:03.588780 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6ad9940f-df1b-44d5-8982-30b35e8d2d3d" (UID: "6ad9940f-df1b-44d5-8982-30b35e8d2d3d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 14:00:03 crc kubenswrapper[4867]: I1006 14:00:03.686607 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmdfv\" (UniqueName: \"kubernetes.io/projected/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-kube-api-access-fmdfv\") on node \"crc\" DevicePath \"\""
Oct 06 14:00:03 crc kubenswrapper[4867]: I1006 14:00:03.686650 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ad9940f-df1b-44d5-8982-30b35e8d2d3d-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 06 14:00:04 crc kubenswrapper[4867]: I1006 14:00:04.125509 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb" event={"ID":"6ad9940f-df1b-44d5-8982-30b35e8d2d3d","Type":"ContainerDied","Data":"0595d5bc08e710198fe0d913bce975943fe615f31d36be3b03466d7efd03b528"}
Oct 06 14:00:04 crc kubenswrapper[4867]: I1006 14:00:04.125913 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0595d5bc08e710198fe0d913bce975943fe615f31d36be3b03466d7efd03b528"
Oct 06 14:00:04 crc kubenswrapper[4867]: I1006 14:00:04.125743 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"
Oct 06 14:00:04 crc kubenswrapper[4867]: I1006 14:00:04.590114 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh"]
Oct 06 14:00:04 crc kubenswrapper[4867]: I1006 14:00:04.603903 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329275-bp6nh"]
Oct 06 14:00:05 crc kubenswrapper[4867]: I1006 14:00:05.238864 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a8b845-d9b0-4111-b9f2-01d31c27fefe" path="/var/lib/kubelet/pods/b8a8b845-d9b0-4111-b9f2-01d31c27fefe/volumes"
Oct 06 14:00:07 crc kubenswrapper[4867]: I1006 14:00:07.221488 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522"
Oct 06 14:00:07 crc kubenswrapper[4867]: E1006 14:00:07.222126 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3"
Oct 06 14:00:21 crc kubenswrapper[4867]: I1006 14:00:21.221700 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522"
Oct 06 14:00:21 crc kubenswrapper[4867]: E1006 14:00:21.223199 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3"
Oct 06 14:00:34 crc kubenswrapper[4867]: I1006 14:00:34.221859 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522"
Oct 06 14:00:34 crc kubenswrapper[4867]: E1006 14:00:34.222765 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.200810 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rf57b"]
Oct 06 14:00:39 crc kubenswrapper[4867]: E1006 14:00:39.202136 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad9940f-df1b-44d5-8982-30b35e8d2d3d" containerName="collect-profiles"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.202158 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad9940f-df1b-44d5-8982-30b35e8d2d3d" containerName="collect-profiles"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.202470 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ad9940f-df1b-44d5-8982-30b35e8d2d3d" containerName="collect-profiles"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.204573 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.213357 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rf57b"]
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.267107 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-catalog-content\") pod \"community-operators-rf57b\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") " pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.267165 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-utilities\") pod \"community-operators-rf57b\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") " pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.267761 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7gqg\" (UniqueName: \"kubernetes.io/projected/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-kube-api-access-f7gqg\") pod \"community-operators-rf57b\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") " pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.370142 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7gqg\" (UniqueName: \"kubernetes.io/projected/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-kube-api-access-f7gqg\") pod \"community-operators-rf57b\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") " pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.370259 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-catalog-content\") pod \"community-operators-rf57b\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") " pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.370286 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-utilities\") pod \"community-operators-rf57b\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") " pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.370873 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-catalog-content\") pod \"community-operators-rf57b\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") " pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.370955 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-utilities\") pod \"community-operators-rf57b\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") " pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.400767 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7gqg\" (UniqueName: \"kubernetes.io/projected/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-kube-api-access-f7gqg\") pod \"community-operators-rf57b\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") " pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:39 crc kubenswrapper[4867]: I1006 14:00:39.544009 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:40 crc kubenswrapper[4867]: I1006 14:00:40.068219 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rf57b"]
Oct 06 14:00:40 crc kubenswrapper[4867]: I1006 14:00:40.517665 4867 generic.go:334] "Generic (PLEG): container finished" podID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" containerID="f5df96a0966f86eeff3690aa7a5d0d618cea662904207ac59669aa4746f5bb65" exitCode=0
Oct 06 14:00:40 crc kubenswrapper[4867]: I1006 14:00:40.517775 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf57b" event={"ID":"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96","Type":"ContainerDied","Data":"f5df96a0966f86eeff3690aa7a5d0d618cea662904207ac59669aa4746f5bb65"}
Oct 06 14:00:40 crc kubenswrapper[4867]: I1006 14:00:40.519233 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf57b" event={"ID":"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96","Type":"ContainerStarted","Data":"94dea1b83846da9bfd9ceb95761abc1a6a663adf7ff39fb2391092ecc6c15550"}
Oct 06 14:00:40 crc kubenswrapper[4867]: I1006 14:00:40.521202 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 06 14:00:42 crc kubenswrapper[4867]: I1006 14:00:42.541487 4867 generic.go:334] "Generic (PLEG): container finished" podID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" containerID="6aac06163cdef360f5bdd8097dc1338f53d1dfd21c318f615555b84598006095" exitCode=0
Oct 06 14:00:42 crc kubenswrapper[4867]: I1006 14:00:42.541586 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf57b" event={"ID":"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96","Type":"ContainerDied","Data":"6aac06163cdef360f5bdd8097dc1338f53d1dfd21c318f615555b84598006095"}
Oct 06 14:00:43 crc kubenswrapper[4867]: I1006 14:00:43.554619 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf57b" event={"ID":"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96","Type":"ContainerStarted","Data":"f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145"}
Oct 06 14:00:43 crc kubenswrapper[4867]: I1006 14:00:43.583613 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rf57b" podStartSLOduration=1.973693763 podStartE2EDuration="4.583592518s" podCreationTimestamp="2025-10-06 14:00:39 +0000 UTC" firstStartedPulling="2025-10-06 14:00:40.5209247 +0000 UTC m=+3419.978872844" lastFinishedPulling="2025-10-06 14:00:43.130823435 +0000 UTC m=+3422.588771599" observedRunningTime="2025-10-06 14:00:43.576917686 +0000 UTC m=+3423.034865830" watchObservedRunningTime="2025-10-06 14:00:43.583592518 +0000 UTC m=+3423.041540662"
Oct 06 14:00:45 crc kubenswrapper[4867]: I1006 14:00:45.221468 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522"
Oct 06 14:00:45 crc kubenswrapper[4867]: I1006 14:00:45.580598 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"882abd54c7802eefd854cdc2aeafb2f94092441b75c4e4174c09ec896ef711cd"}
Oct 06 14:00:49 crc kubenswrapper[4867]: I1006 14:00:49.544369 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:49 crc kubenswrapper[4867]: I1006 14:00:49.545348 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:49 crc kubenswrapper[4867]: I1006 14:00:49.619390 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:49 crc kubenswrapper[4867]: I1006 14:00:49.676150 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:49 crc kubenswrapper[4867]: I1006 14:00:49.866596 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rf57b"]
Oct 06 14:00:51 crc kubenswrapper[4867]: I1006 14:00:51.638977 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rf57b" podUID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" containerName="registry-server" containerID="cri-o://f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145" gracePeriod=2
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.195481 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.286919 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7gqg\" (UniqueName: \"kubernetes.io/projected/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-kube-api-access-f7gqg\") pod \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") "
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.287207 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-catalog-content\") pod \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") "
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.287984 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-utilities\") pod \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\" (UID: \"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96\") "
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.289312 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-utilities" (OuterVolumeSpecName: "utilities") pod "000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" (UID: "000b01f5-5663-48d1-a9a0-e2fe3f3c3f96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.294195 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-kube-api-access-f7gqg" (OuterVolumeSpecName: "kube-api-access-f7gqg") pod "000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" (UID: "000b01f5-5663-48d1-a9a0-e2fe3f3c3f96"). InnerVolumeSpecName "kube-api-access-f7gqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.391983 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.392033 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7gqg\" (UniqueName: \"kubernetes.io/projected/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-kube-api-access-f7gqg\") on node \"crc\" DevicePath \"\""
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.508677 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" (UID: "000b01f5-5663-48d1-a9a0-e2fe3f3c3f96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.598108 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.672427 4867 generic.go:334] "Generic (PLEG): container finished" podID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" containerID="f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145" exitCode=0
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.672509 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf57b" event={"ID":"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96","Type":"ContainerDied","Data":"f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145"}
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.672553 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf57b" event={"ID":"000b01f5-5663-48d1-a9a0-e2fe3f3c3f96","Type":"ContainerDied","Data":"94dea1b83846da9bfd9ceb95761abc1a6a663adf7ff39fb2391092ecc6c15550"}
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.672606 4867 scope.go:117] "RemoveContainer" containerID="f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145"
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.673016 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rf57b"
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.720287 4867 scope.go:117] "RemoveContainer" containerID="6aac06163cdef360f5bdd8097dc1338f53d1dfd21c318f615555b84598006095"
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.735435 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rf57b"]
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.744102 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rf57b"]
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.749208 4867 scope.go:117] "RemoveContainer" containerID="f5df96a0966f86eeff3690aa7a5d0d618cea662904207ac59669aa4746f5bb65"
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.799476 4867 scope.go:117] "RemoveContainer" containerID="f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145"
Oct 06 14:00:52 crc kubenswrapper[4867]: E1006 14:00:52.800426 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145\": container with ID starting with f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145 not found: ID does not exist" containerID="f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145"
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.800501 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145"} err="failed to get container status \"f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145\": rpc error: code = NotFound desc = could not find container \"f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145\": container with ID starting with f3a2852a846def6b2803e497c8b23aa0e40c62118b530c186d6cd79997e0a145 not found: ID does not exist"
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.800547 4867 scope.go:117] "RemoveContainer" containerID="6aac06163cdef360f5bdd8097dc1338f53d1dfd21c318f615555b84598006095"
Oct 06 14:00:52 crc kubenswrapper[4867]: E1006 14:00:52.801991 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aac06163cdef360f5bdd8097dc1338f53d1dfd21c318f615555b84598006095\": container with ID starting with 6aac06163cdef360f5bdd8097dc1338f53d1dfd21c318f615555b84598006095 not found: ID does not exist" containerID="6aac06163cdef360f5bdd8097dc1338f53d1dfd21c318f615555b84598006095"
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.802034 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aac06163cdef360f5bdd8097dc1338f53d1dfd21c318f615555b84598006095"} err="failed to get container status \"6aac06163cdef360f5bdd8097dc1338f53d1dfd21c318f615555b84598006095\": rpc error: code = NotFound desc = could not find container \"6aac06163cdef360f5bdd8097dc1338f53d1dfd21c318f615555b84598006095\": container with ID starting with 6aac06163cdef360f5bdd8097dc1338f53d1dfd21c318f615555b84598006095 not found: ID does not exist"
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.802064 4867 scope.go:117] "RemoveContainer" containerID="f5df96a0966f86eeff3690aa7a5d0d618cea662904207ac59669aa4746f5bb65"
Oct 06 14:00:52 crc kubenswrapper[4867]: E1006 14:00:52.802479 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5df96a0966f86eeff3690aa7a5d0d618cea662904207ac59669aa4746f5bb65\": container with ID starting with f5df96a0966f86eeff3690aa7a5d0d618cea662904207ac59669aa4746f5bb65 not found: ID does not exist" containerID="f5df96a0966f86eeff3690aa7a5d0d618cea662904207ac59669aa4746f5bb65"
Oct 06 14:00:52 crc kubenswrapper[4867]: I1006 14:00:52.802529 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5df96a0966f86eeff3690aa7a5d0d618cea662904207ac59669aa4746f5bb65"} err="failed to get container status \"f5df96a0966f86eeff3690aa7a5d0d618cea662904207ac59669aa4746f5bb65\": rpc error: code = NotFound desc = could not find container \"f5df96a0966f86eeff3690aa7a5d0d618cea662904207ac59669aa4746f5bb65\": container with ID starting with f5df96a0966f86eeff3690aa7a5d0d618cea662904207ac59669aa4746f5bb65 not found: ID does not exist"
Oct 06 14:00:53 crc kubenswrapper[4867]: I1006 14:00:53.234222 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" path="/var/lib/kubelet/pods/000b01f5-5663-48d1-a9a0-e2fe3f3c3f96/volumes"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.158379 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29329321-p9bh2"]
Oct 06 14:01:00 crc kubenswrapper[4867]: E1006 14:01:00.159592 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" containerName="extract-content"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.159610 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" containerName="extract-content"
Oct 06 14:01:00 crc kubenswrapper[4867]: E1006 14:01:00.159635 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" containerName="extract-utilities"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.159644 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" containerName="extract-utilities"
Oct 06 14:01:00 crc kubenswrapper[4867]: E1006 14:01:00.159663 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" containerName="registry-server"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.159669 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" containerName="registry-server"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.159902 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="000b01f5-5663-48d1-a9a0-e2fe3f3c3f96" containerName="registry-server"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.160942 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329321-p9bh2"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.173832 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329321-p9bh2"]
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.178635 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-fernet-keys\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.179049 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-combined-ca-bundle\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.179144 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk9c4\" (UniqueName: \"kubernetes.io/projected/ceb3352a-f644-4721-9e76-8c27cb9e26ac-kube-api-access-pk9c4\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.179275 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-config-data\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.282274 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-combined-ca-bundle\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.282345 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk9c4\" (UniqueName: \"kubernetes.io/projected/ceb3352a-f644-4721-9e76-8c27cb9e26ac-kube-api-access-pk9c4\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.282382 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-config-data\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.282482 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-fernet-keys\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2"
Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.295949 4867 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-config-data\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2" Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.299482 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-fernet-keys\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2" Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.299533 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-combined-ca-bundle\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2" Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.307389 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk9c4\" (UniqueName: \"kubernetes.io/projected/ceb3352a-f644-4721-9e76-8c27cb9e26ac-kube-api-access-pk9c4\") pod \"keystone-cron-29329321-p9bh2\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " pod="openstack/keystone-cron-29329321-p9bh2" Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.502819 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29329321-p9bh2" Oct 06 14:01:00 crc kubenswrapper[4867]: I1006 14:01:00.991485 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329321-p9bh2"] Oct 06 14:01:01 crc kubenswrapper[4867]: I1006 14:01:01.795235 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329321-p9bh2" event={"ID":"ceb3352a-f644-4721-9e76-8c27cb9e26ac","Type":"ContainerStarted","Data":"99d8cfee418710e28c542615ec10c16da1b4648296a2c38efb873ed31765ecea"} Oct 06 14:01:01 crc kubenswrapper[4867]: I1006 14:01:01.795695 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329321-p9bh2" event={"ID":"ceb3352a-f644-4721-9e76-8c27cb9e26ac","Type":"ContainerStarted","Data":"64501c501a987d8e1b363afdb60ae908a12ab0748ce1ae92455975b057b1bce7"} Oct 06 14:01:01 crc kubenswrapper[4867]: I1006 14:01:01.819975 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29329321-p9bh2" podStartSLOduration=1.819956847 podStartE2EDuration="1.819956847s" podCreationTimestamp="2025-10-06 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:01:01.815080084 +0000 UTC m=+3441.273028228" watchObservedRunningTime="2025-10-06 14:01:01.819956847 +0000 UTC m=+3441.277904991" Oct 06 14:01:03 crc kubenswrapper[4867]: I1006 14:01:03.293989 4867 scope.go:117] "RemoveContainer" containerID="48072206700e8e52d49ddad0cd2d2e4e13327965a3654a4cfb8f24e4fa2ff144" Oct 06 14:01:04 crc kubenswrapper[4867]: I1006 14:01:04.850909 4867 generic.go:334] "Generic (PLEG): container finished" podID="ceb3352a-f644-4721-9e76-8c27cb9e26ac" containerID="99d8cfee418710e28c542615ec10c16da1b4648296a2c38efb873ed31765ecea" exitCode=0 Oct 06 14:01:04 crc kubenswrapper[4867]: I1006 14:01:04.851669 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29329321-p9bh2" event={"ID":"ceb3352a-f644-4721-9e76-8c27cb9e26ac","Type":"ContainerDied","Data":"99d8cfee418710e28c542615ec10c16da1b4648296a2c38efb873ed31765ecea"} Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.238018 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329321-p9bh2" Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.309605 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-combined-ca-bundle\") pod \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.309682 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-config-data\") pod \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.309719 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk9c4\" (UniqueName: \"kubernetes.io/projected/ceb3352a-f644-4721-9e76-8c27cb9e26ac-kube-api-access-pk9c4\") pod \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.309785 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-fernet-keys\") pod \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\" (UID: \"ceb3352a-f644-4721-9e76-8c27cb9e26ac\") " Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.319594 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ceb3352a-f644-4721-9e76-8c27cb9e26ac" (UID: "ceb3352a-f644-4721-9e76-8c27cb9e26ac"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.319612 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb3352a-f644-4721-9e76-8c27cb9e26ac-kube-api-access-pk9c4" (OuterVolumeSpecName: "kube-api-access-pk9c4") pod "ceb3352a-f644-4721-9e76-8c27cb9e26ac" (UID: "ceb3352a-f644-4721-9e76-8c27cb9e26ac"). InnerVolumeSpecName "kube-api-access-pk9c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.348685 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ceb3352a-f644-4721-9e76-8c27cb9e26ac" (UID: "ceb3352a-f644-4721-9e76-8c27cb9e26ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.375144 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-config-data" (OuterVolumeSpecName: "config-data") pod "ceb3352a-f644-4721-9e76-8c27cb9e26ac" (UID: "ceb3352a-f644-4721-9e76-8c27cb9e26ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.411993 4867 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.412035 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.412044 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk9c4\" (UniqueName: \"kubernetes.io/projected/ceb3352a-f644-4721-9e76-8c27cb9e26ac-kube-api-access-pk9c4\") on node \"crc\" DevicePath \"\"" Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.412054 4867 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ceb3352a-f644-4721-9e76-8c27cb9e26ac-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.874456 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329321-p9bh2" event={"ID":"ceb3352a-f644-4721-9e76-8c27cb9e26ac","Type":"ContainerDied","Data":"64501c501a987d8e1b363afdb60ae908a12ab0748ce1ae92455975b057b1bce7"} Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.874502 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64501c501a987d8e1b363afdb60ae908a12ab0748ce1ae92455975b057b1bce7" Oct 06 14:01:06 crc kubenswrapper[4867]: I1006 14:01:06.874515 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29329321-p9bh2" Oct 06 14:03:12 crc kubenswrapper[4867]: I1006 14:03:12.873969 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:03:12 crc kubenswrapper[4867]: I1006 14:03:12.874673 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:03:42 crc kubenswrapper[4867]: I1006 14:03:42.873464 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:03:42 crc kubenswrapper[4867]: I1006 14:03:42.874066 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:04:12 crc kubenswrapper[4867]: I1006 14:04:12.873815 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:04:12 crc kubenswrapper[4867]: I1006 14:04:12.874759 4867 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:04:12 crc kubenswrapper[4867]: I1006 14:04:12.874824 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 14:04:12 crc kubenswrapper[4867]: I1006 14:04:12.876385 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"882abd54c7802eefd854cdc2aeafb2f94092441b75c4e4174c09ec896ef711cd"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 14:04:12 crc kubenswrapper[4867]: I1006 14:04:12.876498 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://882abd54c7802eefd854cdc2aeafb2f94092441b75c4e4174c09ec896ef711cd" gracePeriod=600 Oct 06 14:04:13 crc kubenswrapper[4867]: I1006 14:04:13.788910 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="882abd54c7802eefd854cdc2aeafb2f94092441b75c4e4174c09ec896ef711cd" exitCode=0 Oct 06 14:04:13 crc kubenswrapper[4867]: I1006 14:04:13.789013 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"882abd54c7802eefd854cdc2aeafb2f94092441b75c4e4174c09ec896ef711cd"} Oct 06 14:04:13 crc kubenswrapper[4867]: I1006 
14:04:13.790168 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498"} Oct 06 14:04:13 crc kubenswrapper[4867]: I1006 14:04:13.790229 4867 scope.go:117] "RemoveContainer" containerID="8547064a0533bd97499a551e9057a6f4cf65a9a613cc615b06f5c836f7f7d522" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.388207 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bg9dq"] Oct 06 14:05:30 crc kubenswrapper[4867]: E1006 14:05:30.389325 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb3352a-f644-4721-9e76-8c27cb9e26ac" containerName="keystone-cron" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.389345 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb3352a-f644-4721-9e76-8c27cb9e26ac" containerName="keystone-cron" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.389589 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb3352a-f644-4721-9e76-8c27cb9e26ac" containerName="keystone-cron" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.391182 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.404663 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bg9dq"] Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.452085 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-utilities\") pod \"redhat-marketplace-bg9dq\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.452173 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm2v2\" (UniqueName: \"kubernetes.io/projected/2d513f35-2ba3-4503-9d75-16365766357a-kube-api-access-mm2v2\") pod \"redhat-marketplace-bg9dq\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.452200 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-catalog-content\") pod \"redhat-marketplace-bg9dq\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.554338 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-utilities\") pod \"redhat-marketplace-bg9dq\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.554408 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-mm2v2\" (UniqueName: \"kubernetes.io/projected/2d513f35-2ba3-4503-9d75-16365766357a-kube-api-access-mm2v2\") pod \"redhat-marketplace-bg9dq\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.554437 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-catalog-content\") pod \"redhat-marketplace-bg9dq\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.554971 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-utilities\") pod \"redhat-marketplace-bg9dq\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.554993 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-catalog-content\") pod \"redhat-marketplace-bg9dq\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.579182 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm2v2\" (UniqueName: \"kubernetes.io/projected/2d513f35-2ba3-4503-9d75-16365766357a-kube-api-access-mm2v2\") pod \"redhat-marketplace-bg9dq\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:30 crc kubenswrapper[4867]: I1006 14:05:30.729648 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:31 crc kubenswrapper[4867]: I1006 14:05:31.315480 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bg9dq"] Oct 06 14:05:31 crc kubenswrapper[4867]: I1006 14:05:31.656739 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg9dq" event={"ID":"2d513f35-2ba3-4503-9d75-16365766357a","Type":"ContainerStarted","Data":"80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da"} Oct 06 14:05:31 crc kubenswrapper[4867]: I1006 14:05:31.657155 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg9dq" event={"ID":"2d513f35-2ba3-4503-9d75-16365766357a","Type":"ContainerStarted","Data":"c9d6eb873eb02714195cf6c21f6ce8b3dd2270c9d82774d99515e5c5ff13b4ab"} Oct 06 14:05:32 crc kubenswrapper[4867]: I1006 14:05:32.666853 4867 generic.go:334] "Generic (PLEG): container finished" podID="2d513f35-2ba3-4503-9d75-16365766357a" containerID="80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da" exitCode=0 Oct 06 14:05:32 crc kubenswrapper[4867]: I1006 14:05:32.666908 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg9dq" event={"ID":"2d513f35-2ba3-4503-9d75-16365766357a","Type":"ContainerDied","Data":"80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da"} Oct 06 14:05:34 crc kubenswrapper[4867]: I1006 14:05:34.686752 4867 generic.go:334] "Generic (PLEG): container finished" podID="2d513f35-2ba3-4503-9d75-16365766357a" containerID="2b498c7ade2722ea6b013fafc28e63d01c4d8b7968428ae0e4d79592fb0d154c" exitCode=0 Oct 06 14:05:34 crc kubenswrapper[4867]: I1006 14:05:34.686860 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg9dq" 
event={"ID":"2d513f35-2ba3-4503-9d75-16365766357a","Type":"ContainerDied","Data":"2b498c7ade2722ea6b013fafc28e63d01c4d8b7968428ae0e4d79592fb0d154c"} Oct 06 14:05:35 crc kubenswrapper[4867]: I1006 14:05:35.698944 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg9dq" event={"ID":"2d513f35-2ba3-4503-9d75-16365766357a","Type":"ContainerStarted","Data":"cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6"} Oct 06 14:05:35 crc kubenswrapper[4867]: I1006 14:05:35.723172 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bg9dq" podStartSLOduration=2.93484907 podStartE2EDuration="5.723149833s" podCreationTimestamp="2025-10-06 14:05:30 +0000 UTC" firstStartedPulling="2025-10-06 14:05:32.669273535 +0000 UTC m=+3712.127221679" lastFinishedPulling="2025-10-06 14:05:35.457574298 +0000 UTC m=+3714.915522442" observedRunningTime="2025-10-06 14:05:35.715949427 +0000 UTC m=+3715.173897581" watchObservedRunningTime="2025-10-06 14:05:35.723149833 +0000 UTC m=+3715.181097977" Oct 06 14:05:40 crc kubenswrapper[4867]: I1006 14:05:40.730600 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:40 crc kubenswrapper[4867]: I1006 14:05:40.731133 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:40 crc kubenswrapper[4867]: I1006 14:05:40.809508 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:40 crc kubenswrapper[4867]: I1006 14:05:40.875723 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:41 crc kubenswrapper[4867]: I1006 14:05:41.052447 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-bg9dq"] Oct 06 14:05:42 crc kubenswrapper[4867]: I1006 14:05:42.760354 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bg9dq" podUID="2d513f35-2ba3-4503-9d75-16365766357a" containerName="registry-server" containerID="cri-o://cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6" gracePeriod=2 Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.265547 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.431869 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-utilities\") pod \"2d513f35-2ba3-4503-9d75-16365766357a\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.432213 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-catalog-content\") pod \"2d513f35-2ba3-4503-9d75-16365766357a\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.432287 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm2v2\" (UniqueName: \"kubernetes.io/projected/2d513f35-2ba3-4503-9d75-16365766357a-kube-api-access-mm2v2\") pod \"2d513f35-2ba3-4503-9d75-16365766357a\" (UID: \"2d513f35-2ba3-4503-9d75-16365766357a\") " Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.433005 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-utilities" (OuterVolumeSpecName: "utilities") pod "2d513f35-2ba3-4503-9d75-16365766357a" (UID: 
"2d513f35-2ba3-4503-9d75-16365766357a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.441738 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d513f35-2ba3-4503-9d75-16365766357a-kube-api-access-mm2v2" (OuterVolumeSpecName: "kube-api-access-mm2v2") pod "2d513f35-2ba3-4503-9d75-16365766357a" (UID: "2d513f35-2ba3-4503-9d75-16365766357a"). InnerVolumeSpecName "kube-api-access-mm2v2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.447925 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d513f35-2ba3-4503-9d75-16365766357a" (UID: "2d513f35-2ba3-4503-9d75-16365766357a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.535157 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.535195 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm2v2\" (UniqueName: \"kubernetes.io/projected/2d513f35-2ba3-4503-9d75-16365766357a-kube-api-access-mm2v2\") on node \"crc\" DevicePath \"\"" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.535208 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d513f35-2ba3-4503-9d75-16365766357a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.773011 4867 generic.go:334] "Generic (PLEG): container finished" 
podID="2d513f35-2ba3-4503-9d75-16365766357a" containerID="cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6" exitCode=0 Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.773068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg9dq" event={"ID":"2d513f35-2ba3-4503-9d75-16365766357a","Type":"ContainerDied","Data":"cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6"} Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.773102 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg9dq" event={"ID":"2d513f35-2ba3-4503-9d75-16365766357a","Type":"ContainerDied","Data":"c9d6eb873eb02714195cf6c21f6ce8b3dd2270c9d82774d99515e5c5ff13b4ab"} Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.773126 4867 scope.go:117] "RemoveContainer" containerID="cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.773299 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bg9dq" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.800833 4867 scope.go:117] "RemoveContainer" containerID="2b498c7ade2722ea6b013fafc28e63d01c4d8b7968428ae0e4d79592fb0d154c" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.809535 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bg9dq"] Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.819555 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bg9dq"] Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.825231 4867 scope.go:117] "RemoveContainer" containerID="80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.875122 4867 scope.go:117] "RemoveContainer" containerID="cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6" Oct 06 14:05:43 crc kubenswrapper[4867]: E1006 14:05:43.875932 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6\": container with ID starting with cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6 not found: ID does not exist" containerID="cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.875964 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6"} err="failed to get container status \"cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6\": rpc error: code = NotFound desc = could not find container \"cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6\": container with ID starting with cfececaccc6805c86719f05dbfbf4564b7e9cd2f52f7fa419e7207e8f45e5ca6 not found: 
ID does not exist" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.875989 4867 scope.go:117] "RemoveContainer" containerID="2b498c7ade2722ea6b013fafc28e63d01c4d8b7968428ae0e4d79592fb0d154c" Oct 06 14:05:43 crc kubenswrapper[4867]: E1006 14:05:43.880783 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b498c7ade2722ea6b013fafc28e63d01c4d8b7968428ae0e4d79592fb0d154c\": container with ID starting with 2b498c7ade2722ea6b013fafc28e63d01c4d8b7968428ae0e4d79592fb0d154c not found: ID does not exist" containerID="2b498c7ade2722ea6b013fafc28e63d01c4d8b7968428ae0e4d79592fb0d154c" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.880814 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b498c7ade2722ea6b013fafc28e63d01c4d8b7968428ae0e4d79592fb0d154c"} err="failed to get container status \"2b498c7ade2722ea6b013fafc28e63d01c4d8b7968428ae0e4d79592fb0d154c\": rpc error: code = NotFound desc = could not find container \"2b498c7ade2722ea6b013fafc28e63d01c4d8b7968428ae0e4d79592fb0d154c\": container with ID starting with 2b498c7ade2722ea6b013fafc28e63d01c4d8b7968428ae0e4d79592fb0d154c not found: ID does not exist" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.880838 4867 scope.go:117] "RemoveContainer" containerID="80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da" Oct 06 14:05:43 crc kubenswrapper[4867]: E1006 14:05:43.881321 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da\": container with ID starting with 80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da not found: ID does not exist" containerID="80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da" Oct 06 14:05:43 crc kubenswrapper[4867]: I1006 14:05:43.881350 4867 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da"} err="failed to get container status \"80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da\": rpc error: code = NotFound desc = could not find container \"80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da\": container with ID starting with 80ef70624dab6257c6293df3aaa5d2b6c858a796567351679804b83e4511c1da not found: ID does not exist" Oct 06 14:05:45 crc kubenswrapper[4867]: I1006 14:05:45.234548 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d513f35-2ba3-4503-9d75-16365766357a" path="/var/lib/kubelet/pods/2d513f35-2ba3-4503-9d75-16365766357a/volumes" Oct 06 14:06:42 crc kubenswrapper[4867]: I1006 14:06:42.873851 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:06:42 crc kubenswrapper[4867]: I1006 14:06:42.874416 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:07:12 crc kubenswrapper[4867]: I1006 14:07:12.873661 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:07:12 crc kubenswrapper[4867]: I1006 14:07:12.874627 4867 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:07:42 crc kubenswrapper[4867]: I1006 14:07:42.873727 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:07:42 crc kubenswrapper[4867]: I1006 14:07:42.874788 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:07:42 crc kubenswrapper[4867]: I1006 14:07:42.874880 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 14:07:42 crc kubenswrapper[4867]: I1006 14:07:42.876209 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 14:07:42 crc kubenswrapper[4867]: I1006 14:07:42.876426 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" 
containerID="cri-o://14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" gracePeriod=600 Oct 06 14:07:43 crc kubenswrapper[4867]: E1006 14:07:43.020478 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:07:43 crc kubenswrapper[4867]: I1006 14:07:43.999220 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" exitCode=0 Oct 06 14:07:43 crc kubenswrapper[4867]: I1006 14:07:43.999297 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498"} Oct 06 14:07:43 crc kubenswrapper[4867]: I1006 14:07:43.999719 4867 scope.go:117] "RemoveContainer" containerID="882abd54c7802eefd854cdc2aeafb2f94092441b75c4e4174c09ec896ef711cd" Oct 06 14:07:44 crc kubenswrapper[4867]: I1006 14:07:44.000381 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:07:44 crc kubenswrapper[4867]: E1006 14:07:44.000694 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" 
podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:07:57 crc kubenswrapper[4867]: I1006 14:07:57.221233 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:07:57 crc kubenswrapper[4867]: E1006 14:07:57.222202 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:08:12 crc kubenswrapper[4867]: I1006 14:08:12.221851 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:08:12 crc kubenswrapper[4867]: E1006 14:08:12.223067 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:08:27 crc kubenswrapper[4867]: I1006 14:08:27.222502 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:08:27 crc kubenswrapper[4867]: E1006 14:08:27.223488 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:08:42 crc kubenswrapper[4867]: I1006 14:08:42.221706 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:08:42 crc kubenswrapper[4867]: E1006 14:08:42.222534 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.210007 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g9mxp"] Oct 06 14:08:54 crc kubenswrapper[4867]: E1006 14:08:54.211228 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d513f35-2ba3-4503-9d75-16365766357a" containerName="extract-content" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.211243 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d513f35-2ba3-4503-9d75-16365766357a" containerName="extract-content" Oct 06 14:08:54 crc kubenswrapper[4867]: E1006 14:08:54.211295 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d513f35-2ba3-4503-9d75-16365766357a" containerName="registry-server" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.211302 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d513f35-2ba3-4503-9d75-16365766357a" containerName="registry-server" Oct 06 14:08:54 crc kubenswrapper[4867]: E1006 14:08:54.211346 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d513f35-2ba3-4503-9d75-16365766357a" containerName="extract-utilities" Oct 06 14:08:54 crc kubenswrapper[4867]: 
I1006 14:08:54.211355 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d513f35-2ba3-4503-9d75-16365766357a" containerName="extract-utilities" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.211608 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d513f35-2ba3-4503-9d75-16365766357a" containerName="registry-server" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.214745 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.226075 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9mxp"] Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.355674 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-catalog-content\") pod \"redhat-operators-g9mxp\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.355821 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krr5p\" (UniqueName: \"kubernetes.io/projected/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-kube-api-access-krr5p\") pod \"redhat-operators-g9mxp\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.355880 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-utilities\") pod \"redhat-operators-g9mxp\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 
14:08:54.457928 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krr5p\" (UniqueName: \"kubernetes.io/projected/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-kube-api-access-krr5p\") pod \"redhat-operators-g9mxp\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.458006 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-utilities\") pod \"redhat-operators-g9mxp\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.458163 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-catalog-content\") pod \"redhat-operators-g9mxp\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.458668 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-utilities\") pod \"redhat-operators-g9mxp\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.458700 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-catalog-content\") pod \"redhat-operators-g9mxp\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.481468 4867 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-krr5p\" (UniqueName: \"kubernetes.io/projected/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-kube-api-access-krr5p\") pod \"redhat-operators-g9mxp\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:08:54 crc kubenswrapper[4867]: I1006 14:08:54.546625 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:08:55 crc kubenswrapper[4867]: I1006 14:08:55.051127 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9mxp"] Oct 06 14:08:55 crc kubenswrapper[4867]: I1006 14:08:55.221983 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:08:55 crc kubenswrapper[4867]: E1006 14:08:55.222589 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:08:55 crc kubenswrapper[4867]: I1006 14:08:55.847621 4867 generic.go:334] "Generic (PLEG): container finished" podID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" containerID="0a5122eb65aab2f0bfb93b25f7d1309861c0c487122c29ee4d246f421a91ae49" exitCode=0 Oct 06 14:08:55 crc kubenswrapper[4867]: I1006 14:08:55.848343 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mxp" event={"ID":"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e","Type":"ContainerDied","Data":"0a5122eb65aab2f0bfb93b25f7d1309861c0c487122c29ee4d246f421a91ae49"} Oct 06 14:08:55 crc kubenswrapper[4867]: I1006 14:08:55.848463 4867 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-g9mxp" event={"ID":"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e","Type":"ContainerStarted","Data":"ef57b0b9fd44a576ed2997bb1bfeb341e0726f95440075cd1b435099ab58bb7c"} Oct 06 14:08:55 crc kubenswrapper[4867]: I1006 14:08:55.854380 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 14:09:04 crc kubenswrapper[4867]: I1006 14:09:04.951819 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mxp" event={"ID":"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e","Type":"ContainerStarted","Data":"8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a"} Oct 06 14:09:06 crc kubenswrapper[4867]: I1006 14:09:06.974588 4867 generic.go:334] "Generic (PLEG): container finished" podID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" containerID="8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a" exitCode=0 Oct 06 14:09:06 crc kubenswrapper[4867]: I1006 14:09:06.974654 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mxp" event={"ID":"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e","Type":"ContainerDied","Data":"8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a"} Oct 06 14:09:07 crc kubenswrapper[4867]: I1006 14:09:07.987503 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mxp" event={"ID":"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e","Type":"ContainerStarted","Data":"c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454"} Oct 06 14:09:08 crc kubenswrapper[4867]: I1006 14:09:08.009465 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g9mxp" podStartSLOduration=2.272627077 podStartE2EDuration="14.009441724s" podCreationTimestamp="2025-10-06 14:08:54 +0000 UTC" firstStartedPulling="2025-10-06 14:08:55.854110476 +0000 UTC m=+3915.312058620" 
lastFinishedPulling="2025-10-06 14:09:07.590925123 +0000 UTC m=+3927.048873267" observedRunningTime="2025-10-06 14:09:08.005148849 +0000 UTC m=+3927.463097003" watchObservedRunningTime="2025-10-06 14:09:08.009441724 +0000 UTC m=+3927.467389868" Oct 06 14:09:09 crc kubenswrapper[4867]: I1006 14:09:09.221658 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:09:09 crc kubenswrapper[4867]: E1006 14:09:09.222463 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:09:14 crc kubenswrapper[4867]: I1006 14:09:14.547006 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:09:14 crc kubenswrapper[4867]: I1006 14:09:14.547661 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:09:14 crc kubenswrapper[4867]: I1006 14:09:14.598834 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.101938 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.173894 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9mxp"] Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.211536 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-6x2pl"] Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.211930 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6x2pl" podUID="5d532425-fb08-45ce-81ae-4e1b31e099d3" containerName="registry-server" containerID="cri-o://87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed" gracePeriod=2 Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.679504 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2pl" Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.743291 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-utilities\") pod \"5d532425-fb08-45ce-81ae-4e1b31e099d3\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.743468 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-catalog-content\") pod \"5d532425-fb08-45ce-81ae-4e1b31e099d3\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.743506 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zvxl\" (UniqueName: \"kubernetes.io/projected/5d532425-fb08-45ce-81ae-4e1b31e099d3-kube-api-access-8zvxl\") pod \"5d532425-fb08-45ce-81ae-4e1b31e099d3\" (UID: \"5d532425-fb08-45ce-81ae-4e1b31e099d3\") " Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.750370 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-utilities" (OuterVolumeSpecName: "utilities") pod "5d532425-fb08-45ce-81ae-4e1b31e099d3" (UID: 
"5d532425-fb08-45ce-81ae-4e1b31e099d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.752651 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d532425-fb08-45ce-81ae-4e1b31e099d3-kube-api-access-8zvxl" (OuterVolumeSpecName: "kube-api-access-8zvxl") pod "5d532425-fb08-45ce-81ae-4e1b31e099d3" (UID: "5d532425-fb08-45ce-81ae-4e1b31e099d3"). InnerVolumeSpecName "kube-api-access-8zvxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.845799 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.845840 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zvxl\" (UniqueName: \"kubernetes.io/projected/5d532425-fb08-45ce-81ae-4e1b31e099d3-kube-api-access-8zvxl\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.874913 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d532425-fb08-45ce-81ae-4e1b31e099d3" (UID: "5d532425-fb08-45ce-81ae-4e1b31e099d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:09:15 crc kubenswrapper[4867]: I1006 14:09:15.948976 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d532425-fb08-45ce-81ae-4e1b31e099d3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.067230 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2pl" Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.067266 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2pl" event={"ID":"5d532425-fb08-45ce-81ae-4e1b31e099d3","Type":"ContainerDied","Data":"87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed"} Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.067338 4867 scope.go:117] "RemoveContainer" containerID="87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed" Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.067233 4867 generic.go:334] "Generic (PLEG): container finished" podID="5d532425-fb08-45ce-81ae-4e1b31e099d3" containerID="87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed" exitCode=0 Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.067389 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2pl" event={"ID":"5d532425-fb08-45ce-81ae-4e1b31e099d3","Type":"ContainerDied","Data":"3bbda80afa0bb35469d1c37649f9a0e73c8bb2741e2b96f1c15664a1459a847e"} Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.093416 4867 scope.go:117] "RemoveContainer" containerID="ebe75cfb3522ae85dd13c13d4ab1c46c7c1cca6a1440e453e78ac42aa8e3d421" Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.101468 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6x2pl"] Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.112391 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6x2pl"] Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.133898 4867 scope.go:117] "RemoveContainer" containerID="01130d36df842e709beeb466d676d11d621e2eacf0af0dbd74c6e6f8f41add12" Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.174998 4867 scope.go:117] "RemoveContainer" 
containerID="87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed" Oct 06 14:09:16 crc kubenswrapper[4867]: E1006 14:09:16.175562 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed\": container with ID starting with 87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed not found: ID does not exist" containerID="87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed" Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.175627 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed"} err="failed to get container status \"87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed\": rpc error: code = NotFound desc = could not find container \"87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed\": container with ID starting with 87a2e38eeb4ffdb690db4a958b969fe3f826a8e223f62bcbcfab05c026e5c7ed not found: ID does not exist" Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.175659 4867 scope.go:117] "RemoveContainer" containerID="ebe75cfb3522ae85dd13c13d4ab1c46c7c1cca6a1440e453e78ac42aa8e3d421" Oct 06 14:09:16 crc kubenswrapper[4867]: E1006 14:09:16.176074 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebe75cfb3522ae85dd13c13d4ab1c46c7c1cca6a1440e453e78ac42aa8e3d421\": container with ID starting with ebe75cfb3522ae85dd13c13d4ab1c46c7c1cca6a1440e453e78ac42aa8e3d421 not found: ID does not exist" containerID="ebe75cfb3522ae85dd13c13d4ab1c46c7c1cca6a1440e453e78ac42aa8e3d421" Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.176107 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ebe75cfb3522ae85dd13c13d4ab1c46c7c1cca6a1440e453e78ac42aa8e3d421"} err="failed to get container status \"ebe75cfb3522ae85dd13c13d4ab1c46c7c1cca6a1440e453e78ac42aa8e3d421\": rpc error: code = NotFound desc = could not find container \"ebe75cfb3522ae85dd13c13d4ab1c46c7c1cca6a1440e453e78ac42aa8e3d421\": container with ID starting with ebe75cfb3522ae85dd13c13d4ab1c46c7c1cca6a1440e453e78ac42aa8e3d421 not found: ID does not exist" Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.176131 4867 scope.go:117] "RemoveContainer" containerID="01130d36df842e709beeb466d676d11d621e2eacf0af0dbd74c6e6f8f41add12" Oct 06 14:09:16 crc kubenswrapper[4867]: E1006 14:09:16.176605 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01130d36df842e709beeb466d676d11d621e2eacf0af0dbd74c6e6f8f41add12\": container with ID starting with 01130d36df842e709beeb466d676d11d621e2eacf0af0dbd74c6e6f8f41add12 not found: ID does not exist" containerID="01130d36df842e709beeb466d676d11d621e2eacf0af0dbd74c6e6f8f41add12" Oct 06 14:09:16 crc kubenswrapper[4867]: I1006 14:09:16.176637 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01130d36df842e709beeb466d676d11d621e2eacf0af0dbd74c6e6f8f41add12"} err="failed to get container status \"01130d36df842e709beeb466d676d11d621e2eacf0af0dbd74c6e6f8f41add12\": rpc error: code = NotFound desc = could not find container \"01130d36df842e709beeb466d676d11d621e2eacf0af0dbd74c6e6f8f41add12\": container with ID starting with 01130d36df842e709beeb466d676d11d621e2eacf0af0dbd74c6e6f8f41add12 not found: ID does not exist" Oct 06 14:09:17 crc kubenswrapper[4867]: I1006 14:09:17.235303 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d532425-fb08-45ce-81ae-4e1b31e099d3" path="/var/lib/kubelet/pods/5d532425-fb08-45ce-81ae-4e1b31e099d3/volumes" Oct 06 14:09:24 crc kubenswrapper[4867]: I1006 
14:09:24.221643 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:09:24 crc kubenswrapper[4867]: E1006 14:09:24.222440 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.107901 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5xx9t"] Oct 06 14:09:32 crc kubenswrapper[4867]: E1006 14:09:32.109583 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d532425-fb08-45ce-81ae-4e1b31e099d3" containerName="registry-server" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.109615 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d532425-fb08-45ce-81ae-4e1b31e099d3" containerName="registry-server" Oct 06 14:09:32 crc kubenswrapper[4867]: E1006 14:09:32.109696 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d532425-fb08-45ce-81ae-4e1b31e099d3" containerName="extract-content" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.109709 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d532425-fb08-45ce-81ae-4e1b31e099d3" containerName="extract-content" Oct 06 14:09:32 crc kubenswrapper[4867]: E1006 14:09:32.109741 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d532425-fb08-45ce-81ae-4e1b31e099d3" containerName="extract-utilities" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.109755 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d532425-fb08-45ce-81ae-4e1b31e099d3" containerName="extract-utilities" Oct 06 
14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.110123 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d532425-fb08-45ce-81ae-4e1b31e099d3" containerName="registry-server" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.113499 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.121636 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5xx9t"] Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.202550 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwhxg\" (UniqueName: \"kubernetes.io/projected/44a8e2fc-082e-4c07-b85c-99bc82dae152-kube-api-access-zwhxg\") pod \"certified-operators-5xx9t\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.202852 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-utilities\") pod \"certified-operators-5xx9t\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.203068 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-catalog-content\") pod \"certified-operators-5xx9t\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.304339 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-utilities\") pod \"certified-operators-5xx9t\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.304425 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-catalog-content\") pod \"certified-operators-5xx9t\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.304529 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwhxg\" (UniqueName: \"kubernetes.io/projected/44a8e2fc-082e-4c07-b85c-99bc82dae152-kube-api-access-zwhxg\") pod \"certified-operators-5xx9t\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.304923 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-utilities\") pod \"certified-operators-5xx9t\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.304966 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-catalog-content\") pod \"certified-operators-5xx9t\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.323619 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwhxg\" (UniqueName: 
\"kubernetes.io/projected/44a8e2fc-082e-4c07-b85c-99bc82dae152-kube-api-access-zwhxg\") pod \"certified-operators-5xx9t\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.451029 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:32 crc kubenswrapper[4867]: I1006 14:09:32.989852 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5xx9t"] Oct 06 14:09:32 crc kubenswrapper[4867]: W1006 14:09:32.991220 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44a8e2fc_082e_4c07_b85c_99bc82dae152.slice/crio-b93086f3abc84f8d6b9072815b7db84742156985ec339b9441bff528005fe760 WatchSource:0}: Error finding container b93086f3abc84f8d6b9072815b7db84742156985ec339b9441bff528005fe760: Status 404 returned error can't find the container with id b93086f3abc84f8d6b9072815b7db84742156985ec339b9441bff528005fe760 Oct 06 14:09:33 crc kubenswrapper[4867]: I1006 14:09:33.243043 4867 generic.go:334] "Generic (PLEG): container finished" podID="44a8e2fc-082e-4c07-b85c-99bc82dae152" containerID="5e6ebcf5553a9b4c4e5e1c3a321b25350d031618a004419c25c51205eb1d941b" exitCode=0 Oct 06 14:09:33 crc kubenswrapper[4867]: I1006 14:09:33.243219 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xx9t" event={"ID":"44a8e2fc-082e-4c07-b85c-99bc82dae152","Type":"ContainerDied","Data":"5e6ebcf5553a9b4c4e5e1c3a321b25350d031618a004419c25c51205eb1d941b"} Oct 06 14:09:33 crc kubenswrapper[4867]: I1006 14:09:33.243385 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xx9t" 
event={"ID":"44a8e2fc-082e-4c07-b85c-99bc82dae152","Type":"ContainerStarted","Data":"b93086f3abc84f8d6b9072815b7db84742156985ec339b9441bff528005fe760"} Oct 06 14:09:35 crc kubenswrapper[4867]: I1006 14:09:35.274539 4867 generic.go:334] "Generic (PLEG): container finished" podID="44a8e2fc-082e-4c07-b85c-99bc82dae152" containerID="c956a4db3c1f3131c8ebe36a0d5f6931e8fc94d2977bc45f36b1acc8c028aeca" exitCode=0 Oct 06 14:09:35 crc kubenswrapper[4867]: I1006 14:09:35.274601 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xx9t" event={"ID":"44a8e2fc-082e-4c07-b85c-99bc82dae152","Type":"ContainerDied","Data":"c956a4db3c1f3131c8ebe36a0d5f6931e8fc94d2977bc45f36b1acc8c028aeca"} Oct 06 14:09:36 crc kubenswrapper[4867]: I1006 14:09:36.223116 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:09:36 crc kubenswrapper[4867]: E1006 14:09:36.223708 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:09:36 crc kubenswrapper[4867]: I1006 14:09:36.292736 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xx9t" event={"ID":"44a8e2fc-082e-4c07-b85c-99bc82dae152","Type":"ContainerStarted","Data":"22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a"} Oct 06 14:09:36 crc kubenswrapper[4867]: I1006 14:09:36.328797 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5xx9t" podStartSLOduration=1.7073247249999999 podStartE2EDuration="4.328741384s" 
podCreationTimestamp="2025-10-06 14:09:32 +0000 UTC" firstStartedPulling="2025-10-06 14:09:33.248446044 +0000 UTC m=+3952.706394188" lastFinishedPulling="2025-10-06 14:09:35.869862703 +0000 UTC m=+3955.327810847" observedRunningTime="2025-10-06 14:09:36.323490054 +0000 UTC m=+3955.781438208" watchObservedRunningTime="2025-10-06 14:09:36.328741384 +0000 UTC m=+3955.786689528" Oct 06 14:09:42 crc kubenswrapper[4867]: I1006 14:09:42.452545 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:42 crc kubenswrapper[4867]: I1006 14:09:42.454037 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:42 crc kubenswrapper[4867]: I1006 14:09:42.500541 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:43 crc kubenswrapper[4867]: I1006 14:09:43.427634 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:43 crc kubenswrapper[4867]: I1006 14:09:43.481800 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5xx9t"] Oct 06 14:09:45 crc kubenswrapper[4867]: I1006 14:09:45.395098 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5xx9t" podUID="44a8e2fc-082e-4c07-b85c-99bc82dae152" containerName="registry-server" containerID="cri-o://22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a" gracePeriod=2 Oct 06 14:09:45 crc kubenswrapper[4867]: I1006 14:09:45.953128 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.120638 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-utilities\") pod \"44a8e2fc-082e-4c07-b85c-99bc82dae152\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.121128 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-catalog-content\") pod \"44a8e2fc-082e-4c07-b85c-99bc82dae152\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.121622 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwhxg\" (UniqueName: \"kubernetes.io/projected/44a8e2fc-082e-4c07-b85c-99bc82dae152-kube-api-access-zwhxg\") pod \"44a8e2fc-082e-4c07-b85c-99bc82dae152\" (UID: \"44a8e2fc-082e-4c07-b85c-99bc82dae152\") " Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.122160 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-utilities" (OuterVolumeSpecName: "utilities") pod "44a8e2fc-082e-4c07-b85c-99bc82dae152" (UID: "44a8e2fc-082e-4c07-b85c-99bc82dae152"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.123022 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.131329 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a8e2fc-082e-4c07-b85c-99bc82dae152-kube-api-access-zwhxg" (OuterVolumeSpecName: "kube-api-access-zwhxg") pod "44a8e2fc-082e-4c07-b85c-99bc82dae152" (UID: "44a8e2fc-082e-4c07-b85c-99bc82dae152"). InnerVolumeSpecName "kube-api-access-zwhxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.171426 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44a8e2fc-082e-4c07-b85c-99bc82dae152" (UID: "44a8e2fc-082e-4c07-b85c-99bc82dae152"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.225534 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwhxg\" (UniqueName: \"kubernetes.io/projected/44a8e2fc-082e-4c07-b85c-99bc82dae152-kube-api-access-zwhxg\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.225962 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a8e2fc-082e-4c07-b85c-99bc82dae152-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.410070 4867 generic.go:334] "Generic (PLEG): container finished" podID="44a8e2fc-082e-4c07-b85c-99bc82dae152" containerID="22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a" exitCode=0 Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.410176 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5xx9t" Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.410171 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xx9t" event={"ID":"44a8e2fc-082e-4c07-b85c-99bc82dae152","Type":"ContainerDied","Data":"22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a"} Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.411719 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xx9t" event={"ID":"44a8e2fc-082e-4c07-b85c-99bc82dae152","Type":"ContainerDied","Data":"b93086f3abc84f8d6b9072815b7db84742156985ec339b9441bff528005fe760"} Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.411769 4867 scope.go:117] "RemoveContainer" containerID="22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a" Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.438731 4867 scope.go:117] "RemoveContainer" 
containerID="c956a4db3c1f3131c8ebe36a0d5f6931e8fc94d2977bc45f36b1acc8c028aeca" Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.455970 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5xx9t"] Oct 06 14:09:46 crc kubenswrapper[4867]: I1006 14:09:46.466318 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5xx9t"] Oct 06 14:09:47 crc kubenswrapper[4867]: I1006 14:09:47.110304 4867 scope.go:117] "RemoveContainer" containerID="5e6ebcf5553a9b4c4e5e1c3a321b25350d031618a004419c25c51205eb1d941b" Oct 06 14:09:47 crc kubenswrapper[4867]: I1006 14:09:47.172425 4867 scope.go:117] "RemoveContainer" containerID="22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a" Oct 06 14:09:47 crc kubenswrapper[4867]: E1006 14:09:47.173865 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a\": container with ID starting with 22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a not found: ID does not exist" containerID="22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a" Oct 06 14:09:47 crc kubenswrapper[4867]: I1006 14:09:47.173943 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a"} err="failed to get container status \"22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a\": rpc error: code = NotFound desc = could not find container \"22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a\": container with ID starting with 22a97ec64699335dd02d80c2ea7734d695f503190ca4d368887bd303727ade4a not found: ID does not exist" Oct 06 14:09:47 crc kubenswrapper[4867]: I1006 14:09:47.173989 4867 scope.go:117] "RemoveContainer" 
containerID="c956a4db3c1f3131c8ebe36a0d5f6931e8fc94d2977bc45f36b1acc8c028aeca" Oct 06 14:09:47 crc kubenswrapper[4867]: E1006 14:09:47.174570 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c956a4db3c1f3131c8ebe36a0d5f6931e8fc94d2977bc45f36b1acc8c028aeca\": container with ID starting with c956a4db3c1f3131c8ebe36a0d5f6931e8fc94d2977bc45f36b1acc8c028aeca not found: ID does not exist" containerID="c956a4db3c1f3131c8ebe36a0d5f6931e8fc94d2977bc45f36b1acc8c028aeca" Oct 06 14:09:47 crc kubenswrapper[4867]: I1006 14:09:47.174613 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c956a4db3c1f3131c8ebe36a0d5f6931e8fc94d2977bc45f36b1acc8c028aeca"} err="failed to get container status \"c956a4db3c1f3131c8ebe36a0d5f6931e8fc94d2977bc45f36b1acc8c028aeca\": rpc error: code = NotFound desc = could not find container \"c956a4db3c1f3131c8ebe36a0d5f6931e8fc94d2977bc45f36b1acc8c028aeca\": container with ID starting with c956a4db3c1f3131c8ebe36a0d5f6931e8fc94d2977bc45f36b1acc8c028aeca not found: ID does not exist" Oct 06 14:09:47 crc kubenswrapper[4867]: I1006 14:09:47.174639 4867 scope.go:117] "RemoveContainer" containerID="5e6ebcf5553a9b4c4e5e1c3a321b25350d031618a004419c25c51205eb1d941b" Oct 06 14:09:47 crc kubenswrapper[4867]: E1006 14:09:47.175077 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e6ebcf5553a9b4c4e5e1c3a321b25350d031618a004419c25c51205eb1d941b\": container with ID starting with 5e6ebcf5553a9b4c4e5e1c3a321b25350d031618a004419c25c51205eb1d941b not found: ID does not exist" containerID="5e6ebcf5553a9b4c4e5e1c3a321b25350d031618a004419c25c51205eb1d941b" Oct 06 14:09:47 crc kubenswrapper[4867]: I1006 14:09:47.175122 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5e6ebcf5553a9b4c4e5e1c3a321b25350d031618a004419c25c51205eb1d941b"} err="failed to get container status \"5e6ebcf5553a9b4c4e5e1c3a321b25350d031618a004419c25c51205eb1d941b\": rpc error: code = NotFound desc = could not find container \"5e6ebcf5553a9b4c4e5e1c3a321b25350d031618a004419c25c51205eb1d941b\": container with ID starting with 5e6ebcf5553a9b4c4e5e1c3a321b25350d031618a004419c25c51205eb1d941b not found: ID does not exist" Oct 06 14:09:47 crc kubenswrapper[4867]: I1006 14:09:47.235013 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44a8e2fc-082e-4c07-b85c-99bc82dae152" path="/var/lib/kubelet/pods/44a8e2fc-082e-4c07-b85c-99bc82dae152/volumes" Oct 06 14:09:50 crc kubenswrapper[4867]: I1006 14:09:50.221395 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:09:50 crc kubenswrapper[4867]: E1006 14:09:50.222095 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:10:02 crc kubenswrapper[4867]: I1006 14:10:02.221483 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:10:02 crc kubenswrapper[4867]: E1006 14:10:02.222377 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:10:13 crc kubenswrapper[4867]: I1006 14:10:13.221770 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:10:13 crc kubenswrapper[4867]: E1006 14:10:13.222608 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:10:24 crc kubenswrapper[4867]: I1006 14:10:24.221806 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:10:24 crc kubenswrapper[4867]: E1006 14:10:24.222719 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:10:35 crc kubenswrapper[4867]: I1006 14:10:35.221323 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:10:35 crc kubenswrapper[4867]: E1006 14:10:35.221959 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:10:49 crc kubenswrapper[4867]: I1006 14:10:49.221539 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:10:49 crc kubenswrapper[4867]: E1006 14:10:49.222407 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:11:00 crc kubenswrapper[4867]: I1006 14:11:00.221596 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:11:00 crc kubenswrapper[4867]: E1006 14:11:00.222433 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:11:14 crc kubenswrapper[4867]: I1006 14:11:14.222759 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:11:14 crc kubenswrapper[4867]: E1006 14:11:14.224082 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:11:28 crc kubenswrapper[4867]: I1006 14:11:28.220876 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:11:28 crc kubenswrapper[4867]: E1006 14:11:28.221601 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:11:43 crc kubenswrapper[4867]: I1006 14:11:43.221838 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:11:43 crc kubenswrapper[4867]: E1006 14:11:43.222729 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.265310 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fxc9c"] Oct 06 14:11:53 crc kubenswrapper[4867]: E1006 14:11:53.266239 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a8e2fc-082e-4c07-b85c-99bc82dae152" containerName="extract-utilities" Oct 06 14:11:53 crc 
kubenswrapper[4867]: I1006 14:11:53.266274 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a8e2fc-082e-4c07-b85c-99bc82dae152" containerName="extract-utilities" Oct 06 14:11:53 crc kubenswrapper[4867]: E1006 14:11:53.266292 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a8e2fc-082e-4c07-b85c-99bc82dae152" containerName="registry-server" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.266298 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a8e2fc-082e-4c07-b85c-99bc82dae152" containerName="registry-server" Oct 06 14:11:53 crc kubenswrapper[4867]: E1006 14:11:53.266342 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a8e2fc-082e-4c07-b85c-99bc82dae152" containerName="extract-content" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.266348 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a8e2fc-082e-4c07-b85c-99bc82dae152" containerName="extract-content" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.266687 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a8e2fc-082e-4c07-b85c-99bc82dae152" containerName="registry-server" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.268448 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.287566 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fxc9c"] Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.368701 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-catalog-content\") pod \"community-operators-fxc9c\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.368868 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg6mv\" (UniqueName: \"kubernetes.io/projected/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-kube-api-access-vg6mv\") pod \"community-operators-fxc9c\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.368992 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-utilities\") pod \"community-operators-fxc9c\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.471215 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-utilities\") pod \"community-operators-fxc9c\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.471335 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-catalog-content\") pod \"community-operators-fxc9c\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.471424 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg6mv\" (UniqueName: \"kubernetes.io/projected/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-kube-api-access-vg6mv\") pod \"community-operators-fxc9c\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.471714 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-utilities\") pod \"community-operators-fxc9c\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.471964 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-catalog-content\") pod \"community-operators-fxc9c\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.493839 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg6mv\" (UniqueName: \"kubernetes.io/projected/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-kube-api-access-vg6mv\") pod \"community-operators-fxc9c\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:11:53 crc kubenswrapper[4867]: I1006 14:11:53.587018 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:11:54 crc kubenswrapper[4867]: I1006 14:11:54.164032 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fxc9c"] Oct 06 14:11:54 crc kubenswrapper[4867]: I1006 14:11:54.761193 4867 generic.go:334] "Generic (PLEG): container finished" podID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" containerID="5bb39137ec9568e6806c66765747a70653c538f1fec120b23792ae62d10f881d" exitCode=0 Oct 06 14:11:54 crc kubenswrapper[4867]: I1006 14:11:54.761316 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxc9c" event={"ID":"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1","Type":"ContainerDied","Data":"5bb39137ec9568e6806c66765747a70653c538f1fec120b23792ae62d10f881d"} Oct 06 14:11:54 crc kubenswrapper[4867]: I1006 14:11:54.762001 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxc9c" event={"ID":"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1","Type":"ContainerStarted","Data":"d929317633fda25b4d906c28ad903e071828dae8cba959ea12f56603e9047a42"} Oct 06 14:11:55 crc kubenswrapper[4867]: I1006 14:11:55.222510 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:11:55 crc kubenswrapper[4867]: E1006 14:11:55.222902 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:11:56 crc kubenswrapper[4867]: I1006 14:11:56.785815 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxc9c" 
event={"ID":"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1","Type":"ContainerStarted","Data":"8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3"} Oct 06 14:11:57 crc kubenswrapper[4867]: I1006 14:11:57.798221 4867 generic.go:334] "Generic (PLEG): container finished" podID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" containerID="8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3" exitCode=0 Oct 06 14:11:57 crc kubenswrapper[4867]: I1006 14:11:57.798357 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxc9c" event={"ID":"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1","Type":"ContainerDied","Data":"8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3"} Oct 06 14:11:58 crc kubenswrapper[4867]: I1006 14:11:58.809501 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxc9c" event={"ID":"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1","Type":"ContainerStarted","Data":"158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da"} Oct 06 14:11:58 crc kubenswrapper[4867]: I1006 14:11:58.829754 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fxc9c" podStartSLOduration=2.382389806 podStartE2EDuration="5.829735458s" podCreationTimestamp="2025-10-06 14:11:53 +0000 UTC" firstStartedPulling="2025-10-06 14:11:54.763817647 +0000 UTC m=+4094.221765791" lastFinishedPulling="2025-10-06 14:11:58.211163289 +0000 UTC m=+4097.669111443" observedRunningTime="2025-10-06 14:11:58.826434629 +0000 UTC m=+4098.284382773" watchObservedRunningTime="2025-10-06 14:11:58.829735458 +0000 UTC m=+4098.287683602" Oct 06 14:12:03 crc kubenswrapper[4867]: I1006 14:12:03.587756 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:12:03 crc kubenswrapper[4867]: I1006 14:12:03.588365 4867 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:12:03 crc kubenswrapper[4867]: I1006 14:12:03.636363 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:12:03 crc kubenswrapper[4867]: I1006 14:12:03.900347 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:12:03 crc kubenswrapper[4867]: I1006 14:12:03.946640 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fxc9c"] Oct 06 14:12:05 crc kubenswrapper[4867]: I1006 14:12:05.867579 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fxc9c" podUID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" containerName="registry-server" containerID="cri-o://158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da" gracePeriod=2 Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.305277 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.433236 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-utilities\") pod \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.433355 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg6mv\" (UniqueName: \"kubernetes.io/projected/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-kube-api-access-vg6mv\") pod \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.433484 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-catalog-content\") pod \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\" (UID: \"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1\") " Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.435039 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-utilities" (OuterVolumeSpecName: "utilities") pod "7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" (UID: "7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.440059 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-kube-api-access-vg6mv" (OuterVolumeSpecName: "kube-api-access-vg6mv") pod "7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" (UID: "7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1"). InnerVolumeSpecName "kube-api-access-vg6mv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.481635 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" (UID: "7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.536080 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg6mv\" (UniqueName: \"kubernetes.io/projected/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-kube-api-access-vg6mv\") on node \"crc\" DevicePath \"\"" Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.536116 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.536126 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.879420 4867 generic.go:334] "Generic (PLEG): container finished" podID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" containerID="158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da" exitCode=0 Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.879475 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxc9c" event={"ID":"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1","Type":"ContainerDied","Data":"158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da"} Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.879495 4867 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-fxc9c" Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.879523 4867 scope.go:117] "RemoveContainer" containerID="158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da" Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.879511 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxc9c" event={"ID":"7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1","Type":"ContainerDied","Data":"d929317633fda25b4d906c28ad903e071828dae8cba959ea12f56603e9047a42"} Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.915119 4867 scope.go:117] "RemoveContainer" containerID="8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3" Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.938802 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fxc9c"] Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.950178 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fxc9c"] Oct 06 14:12:06 crc kubenswrapper[4867]: I1006 14:12:06.966137 4867 scope.go:117] "RemoveContainer" containerID="5bb39137ec9568e6806c66765747a70653c538f1fec120b23792ae62d10f881d" Oct 06 14:12:07 crc kubenswrapper[4867]: I1006 14:12:07.000834 4867 scope.go:117] "RemoveContainer" containerID="158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da" Oct 06 14:12:07 crc kubenswrapper[4867]: E1006 14:12:07.001439 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da\": container with ID starting with 158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da not found: ID does not exist" containerID="158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da" Oct 06 14:12:07 crc kubenswrapper[4867]: I1006 14:12:07.001480 
4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da"} err="failed to get container status \"158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da\": rpc error: code = NotFound desc = could not find container \"158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da\": container with ID starting with 158dc59658f96b9975584ea77ab8cfc4913fca50162e4f7bc5db24a2a33e38da not found: ID does not exist" Oct 06 14:12:07 crc kubenswrapper[4867]: I1006 14:12:07.001511 4867 scope.go:117] "RemoveContainer" containerID="8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3" Oct 06 14:12:07 crc kubenswrapper[4867]: E1006 14:12:07.001785 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3\": container with ID starting with 8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3 not found: ID does not exist" containerID="8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3" Oct 06 14:12:07 crc kubenswrapper[4867]: I1006 14:12:07.001831 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3"} err="failed to get container status \"8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3\": rpc error: code = NotFound desc = could not find container \"8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3\": container with ID starting with 8525d17cbf0407d9878f8e44a73097316a448e0cb559cbbcbd6913d2401252a3 not found: ID does not exist" Oct 06 14:12:07 crc kubenswrapper[4867]: I1006 14:12:07.001860 4867 scope.go:117] "RemoveContainer" containerID="5bb39137ec9568e6806c66765747a70653c538f1fec120b23792ae62d10f881d" Oct 06 14:12:07 crc kubenswrapper[4867]: E1006 
14:12:07.002322 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb39137ec9568e6806c66765747a70653c538f1fec120b23792ae62d10f881d\": container with ID starting with 5bb39137ec9568e6806c66765747a70653c538f1fec120b23792ae62d10f881d not found: ID does not exist" containerID="5bb39137ec9568e6806c66765747a70653c538f1fec120b23792ae62d10f881d" Oct 06 14:12:07 crc kubenswrapper[4867]: I1006 14:12:07.002352 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bb39137ec9568e6806c66765747a70653c538f1fec120b23792ae62d10f881d"} err="failed to get container status \"5bb39137ec9568e6806c66765747a70653c538f1fec120b23792ae62d10f881d\": rpc error: code = NotFound desc = could not find container \"5bb39137ec9568e6806c66765747a70653c538f1fec120b23792ae62d10f881d\": container with ID starting with 5bb39137ec9568e6806c66765747a70653c538f1fec120b23792ae62d10f881d not found: ID does not exist" Oct 06 14:12:07 crc kubenswrapper[4867]: I1006 14:12:07.234981 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" path="/var/lib/kubelet/pods/7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1/volumes" Oct 06 14:12:08 crc kubenswrapper[4867]: I1006 14:12:08.221659 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:12:08 crc kubenswrapper[4867]: E1006 14:12:08.222209 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:12:20 crc kubenswrapper[4867]: I1006 14:12:20.222366 
4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:12:20 crc kubenswrapper[4867]: E1006 14:12:20.223091 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:12:32 crc kubenswrapper[4867]: I1006 14:12:32.221707 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:12:32 crc kubenswrapper[4867]: E1006 14:12:32.222320 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:12:47 crc kubenswrapper[4867]: I1006 14:12:47.221650 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:12:48 crc kubenswrapper[4867]: I1006 14:12:48.343106 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"d5d10b7a2e5392a1994d4a7d473042a2aca3aaea20b30126de41dd23a2674699"} Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.173438 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs"] Oct 06 
14:15:00 crc kubenswrapper[4867]: E1006 14:15:00.174394 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" containerName="extract-content" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.174409 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" containerName="extract-content" Oct 06 14:15:00 crc kubenswrapper[4867]: E1006 14:15:00.174446 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" containerName="extract-utilities" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.174453 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" containerName="extract-utilities" Oct 06 14:15:00 crc kubenswrapper[4867]: E1006 14:15:00.174469 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" containerName="registry-server" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.174475 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" containerName="registry-server" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.174696 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc4a002-b5ba-441b-a6ab-8ad4c70eb1c1" containerName="registry-server" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.175607 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.178301 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.182318 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.206488 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs"] Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.278142 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-config-volume\") pod \"collect-profiles-29329335-2zbzs\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.278277 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrwlx\" (UniqueName: \"kubernetes.io/projected/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-kube-api-access-jrwlx\") pod \"collect-profiles-29329335-2zbzs\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.278856 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-secret-volume\") pod \"collect-profiles-29329335-2zbzs\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.383299 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-config-volume\") pod \"collect-profiles-29329335-2zbzs\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.383389 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-config-volume\") pod \"collect-profiles-29329335-2zbzs\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.383481 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrwlx\" (UniqueName: \"kubernetes.io/projected/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-kube-api-access-jrwlx\") pod \"collect-profiles-29329335-2zbzs\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.384180 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-secret-volume\") pod \"collect-profiles-29329335-2zbzs\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.395226 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-secret-volume\") pod \"collect-profiles-29329335-2zbzs\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.411173 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrwlx\" (UniqueName: \"kubernetes.io/projected/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-kube-api-access-jrwlx\") pod \"collect-profiles-29329335-2zbzs\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:00 crc kubenswrapper[4867]: I1006 14:15:00.498383 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:01 crc kubenswrapper[4867]: I1006 14:15:01.034635 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs"] Oct 06 14:15:01 crc kubenswrapper[4867]: I1006 14:15:01.802297 4867 generic.go:334] "Generic (PLEG): container finished" podID="3f4a964e-2e33-41a1-bce9-75ed84ee40dd" containerID="7e6c739ce56fe1bcc652091fc965722b711b826be254bbc086b00e09b09b7e34" exitCode=0 Oct 06 14:15:01 crc kubenswrapper[4867]: I1006 14:15:01.802362 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" event={"ID":"3f4a964e-2e33-41a1-bce9-75ed84ee40dd","Type":"ContainerDied","Data":"7e6c739ce56fe1bcc652091fc965722b711b826be254bbc086b00e09b09b7e34"} Oct 06 14:15:01 crc kubenswrapper[4867]: I1006 14:15:01.802398 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" 
event={"ID":"3f4a964e-2e33-41a1-bce9-75ed84ee40dd","Type":"ContainerStarted","Data":"390b2f5195199bdd86158ec45aa83b69142d76d5ed0253aa34a0d3489cd45946"} Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.262693 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.267640 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrwlx\" (UniqueName: \"kubernetes.io/projected/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-kube-api-access-jrwlx\") pod \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.278075 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-kube-api-access-jrwlx" (OuterVolumeSpecName: "kube-api-access-jrwlx") pod "3f4a964e-2e33-41a1-bce9-75ed84ee40dd" (UID: "3f4a964e-2e33-41a1-bce9-75ed84ee40dd"). InnerVolumeSpecName "kube-api-access-jrwlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.370063 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-config-volume\") pod \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.370177 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-secret-volume\") pod \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\" (UID: \"3f4a964e-2e33-41a1-bce9-75ed84ee40dd\") " Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.371168 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f4a964e-2e33-41a1-bce9-75ed84ee40dd" (UID: "3f4a964e-2e33-41a1-bce9-75ed84ee40dd"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.371562 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.371593 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrwlx\" (UniqueName: \"kubernetes.io/projected/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-kube-api-access-jrwlx\") on node \"crc\" DevicePath \"\"" Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.375193 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f4a964e-2e33-41a1-bce9-75ed84ee40dd" (UID: "3f4a964e-2e33-41a1-bce9-75ed84ee40dd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.474422 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f4a964e-2e33-41a1-bce9-75ed84ee40dd-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.828802 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" event={"ID":"3f4a964e-2e33-41a1-bce9-75ed84ee40dd","Type":"ContainerDied","Data":"390b2f5195199bdd86158ec45aa83b69142d76d5ed0253aa34a0d3489cd45946"} Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.828864 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="390b2f5195199bdd86158ec45aa83b69142d76d5ed0253aa34a0d3489cd45946" Oct 06 14:15:03 crc kubenswrapper[4867]: I1006 14:15:03.828887 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329335-2zbzs" Oct 06 14:15:04 crc kubenswrapper[4867]: I1006 14:15:04.373956 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"] Oct 06 14:15:04 crc kubenswrapper[4867]: I1006 14:15:04.382743 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329290-lz9jt"] Oct 06 14:15:05 crc kubenswrapper[4867]: I1006 14:15:05.236815 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39" path="/var/lib/kubelet/pods/7fc16684-b4e8-4c4e-b26e-9c2b11f6fc39/volumes" Oct 06 14:15:12 crc kubenswrapper[4867]: I1006 14:15:12.874211 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:15:12 crc kubenswrapper[4867]: I1006 14:15:12.875163 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:15:42 crc kubenswrapper[4867]: I1006 14:15:42.873406 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:15:42 crc kubenswrapper[4867]: I1006 14:15:42.874001 4867 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:16:03 crc kubenswrapper[4867]: I1006 14:16:03.784345 4867 scope.go:117] "RemoveContainer" containerID="d4d2020afc3220f8bc4e6634e6c2733dd35251a23cc9c0d03751c753678a3c8b" Oct 06 14:16:12 crc kubenswrapper[4867]: I1006 14:16:12.873303 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:16:12 crc kubenswrapper[4867]: I1006 14:16:12.874214 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:16:12 crc kubenswrapper[4867]: I1006 14:16:12.874291 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 14:16:12 crc kubenswrapper[4867]: I1006 14:16:12.875049 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5d10b7a2e5392a1994d4a7d473042a2aca3aaea20b30126de41dd23a2674699"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 14:16:12 crc kubenswrapper[4867]: I1006 14:16:12.875122 4867 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://d5d10b7a2e5392a1994d4a7d473042a2aca3aaea20b30126de41dd23a2674699" gracePeriod=600 Oct 06 14:16:13 crc kubenswrapper[4867]: I1006 14:16:13.596592 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="d5d10b7a2e5392a1994d4a7d473042a2aca3aaea20b30126de41dd23a2674699" exitCode=0 Oct 06 14:16:13 crc kubenswrapper[4867]: I1006 14:16:13.597600 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"d5d10b7a2e5392a1994d4a7d473042a2aca3aaea20b30126de41dd23a2674699"} Oct 06 14:16:13 crc kubenswrapper[4867]: I1006 14:16:13.597658 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246"} Oct 06 14:16:13 crc kubenswrapper[4867]: I1006 14:16:13.597687 4867 scope.go:117] "RemoveContainer" containerID="14d113821ba221b14e0639d6465cb97c0ae136898c8f0d2e1c53cad77c5a2498" Oct 06 14:18:42 crc kubenswrapper[4867]: I1006 14:18:42.873229 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:18:42 crc kubenswrapper[4867]: I1006 14:18:42.874241 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.081575 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bb5nh"] Oct 06 14:18:57 crc kubenswrapper[4867]: E1006 14:18:57.082729 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4a964e-2e33-41a1-bce9-75ed84ee40dd" containerName="collect-profiles" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.082750 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4a964e-2e33-41a1-bce9-75ed84ee40dd" containerName="collect-profiles" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.083101 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4a964e-2e33-41a1-bce9-75ed84ee40dd" containerName="collect-profiles" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.085588 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.101026 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bb5nh"] Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.193132 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-catalog-content\") pod \"redhat-operators-bb5nh\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.193536 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmfht\" (UniqueName: \"kubernetes.io/projected/6c52987f-80e4-4916-81ef-5e3b4494b810-kube-api-access-pmfht\") pod \"redhat-operators-bb5nh\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.193686 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-utilities\") pod \"redhat-operators-bb5nh\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.295695 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmfht\" (UniqueName: \"kubernetes.io/projected/6c52987f-80e4-4916-81ef-5e3b4494b810-kube-api-access-pmfht\") pod \"redhat-operators-bb5nh\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.296086 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-utilities\") pod \"redhat-operators-bb5nh\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.296310 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-catalog-content\") pod \"redhat-operators-bb5nh\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.296746 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-utilities\") pod \"redhat-operators-bb5nh\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.296789 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-catalog-content\") pod \"redhat-operators-bb5nh\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.317570 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmfht\" (UniqueName: \"kubernetes.io/projected/6c52987f-80e4-4916-81ef-5e3b4494b810-kube-api-access-pmfht\") pod \"redhat-operators-bb5nh\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:18:57 crc kubenswrapper[4867]: I1006 14:18:57.412530 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:18:58 crc kubenswrapper[4867]: I1006 14:18:58.622207 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bb5nh"] Oct 06 14:18:59 crc kubenswrapper[4867]: I1006 14:18:59.469768 4867 generic.go:334] "Generic (PLEG): container finished" podID="6c52987f-80e4-4916-81ef-5e3b4494b810" containerID="f4d101f296eab1e0d86e7b4b4976b4ff7ef6cc30ec0521738132d49b21d0fdbe" exitCode=0 Oct 06 14:18:59 crc kubenswrapper[4867]: I1006 14:18:59.469892 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bb5nh" event={"ID":"6c52987f-80e4-4916-81ef-5e3b4494b810","Type":"ContainerDied","Data":"f4d101f296eab1e0d86e7b4b4976b4ff7ef6cc30ec0521738132d49b21d0fdbe"} Oct 06 14:18:59 crc kubenswrapper[4867]: I1006 14:18:59.470153 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bb5nh" event={"ID":"6c52987f-80e4-4916-81ef-5e3b4494b810","Type":"ContainerStarted","Data":"4d59c99f4441040bea754b33ed941d1c1e4baf8562c9ce385079f44850f142a4"} Oct 06 14:18:59 crc kubenswrapper[4867]: I1006 14:18:59.471936 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 14:19:01 crc kubenswrapper[4867]: I1006 14:19:01.491555 4867 generic.go:334] "Generic (PLEG): container finished" podID="6c52987f-80e4-4916-81ef-5e3b4494b810" containerID="ac501a4c404c15fac6622e517c658c36395a56a9a539aecbaf25e84f29726e93" exitCode=0 Oct 06 14:19:01 crc kubenswrapper[4867]: I1006 14:19:01.491747 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bb5nh" event={"ID":"6c52987f-80e4-4916-81ef-5e3b4494b810","Type":"ContainerDied","Data":"ac501a4c404c15fac6622e517c658c36395a56a9a539aecbaf25e84f29726e93"} Oct 06 14:19:02 crc kubenswrapper[4867]: I1006 14:19:02.507139 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-bb5nh" event={"ID":"6c52987f-80e4-4916-81ef-5e3b4494b810","Type":"ContainerStarted","Data":"451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d"} Oct 06 14:19:02 crc kubenswrapper[4867]: I1006 14:19:02.543658 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bb5nh" podStartSLOduration=2.934883197 podStartE2EDuration="5.543620122s" podCreationTimestamp="2025-10-06 14:18:57 +0000 UTC" firstStartedPulling="2025-10-06 14:18:59.471735813 +0000 UTC m=+4518.929683957" lastFinishedPulling="2025-10-06 14:19:02.080472738 +0000 UTC m=+4521.538420882" observedRunningTime="2025-10-06 14:19:02.525200324 +0000 UTC m=+4521.983148468" watchObservedRunningTime="2025-10-06 14:19:02.543620122 +0000 UTC m=+4522.001568266" Oct 06 14:19:07 crc kubenswrapper[4867]: I1006 14:19:07.412979 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:19:07 crc kubenswrapper[4867]: I1006 14:19:07.413969 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:19:07 crc kubenswrapper[4867]: I1006 14:19:07.469019 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:19:07 crc kubenswrapper[4867]: I1006 14:19:07.614602 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:19:07 crc kubenswrapper[4867]: I1006 14:19:07.713666 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bb5nh"] Oct 06 14:19:09 crc kubenswrapper[4867]: I1006 14:19:09.592461 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bb5nh" podUID="6c52987f-80e4-4916-81ef-5e3b4494b810" 
containerName="registry-server" containerID="cri-o://451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d" gracePeriod=2 Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.073262 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.191483 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmfht\" (UniqueName: \"kubernetes.io/projected/6c52987f-80e4-4916-81ef-5e3b4494b810-kube-api-access-pmfht\") pod \"6c52987f-80e4-4916-81ef-5e3b4494b810\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.191565 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-utilities\") pod \"6c52987f-80e4-4916-81ef-5e3b4494b810\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.191608 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-catalog-content\") pod \"6c52987f-80e4-4916-81ef-5e3b4494b810\" (UID: \"6c52987f-80e4-4916-81ef-5e3b4494b810\") " Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.192604 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-utilities" (OuterVolumeSpecName: "utilities") pod "6c52987f-80e4-4916-81ef-5e3b4494b810" (UID: "6c52987f-80e4-4916-81ef-5e3b4494b810"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.199355 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c52987f-80e4-4916-81ef-5e3b4494b810-kube-api-access-pmfht" (OuterVolumeSpecName: "kube-api-access-pmfht") pod "6c52987f-80e4-4916-81ef-5e3b4494b810" (UID: "6c52987f-80e4-4916-81ef-5e3b4494b810"). InnerVolumeSpecName "kube-api-access-pmfht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.283816 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c52987f-80e4-4916-81ef-5e3b4494b810" (UID: "6c52987f-80e4-4916-81ef-5e3b4494b810"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.294679 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmfht\" (UniqueName: \"kubernetes.io/projected/6c52987f-80e4-4916-81ef-5e3b4494b810-kube-api-access-pmfht\") on node \"crc\" DevicePath \"\"" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.294704 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.294714 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c52987f-80e4-4916-81ef-5e3b4494b810-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.604390 4867 generic.go:334] "Generic (PLEG): container finished" podID="6c52987f-80e4-4916-81ef-5e3b4494b810" 
containerID="451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d" exitCode=0 Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.604480 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bb5nh" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.604502 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bb5nh" event={"ID":"6c52987f-80e4-4916-81ef-5e3b4494b810","Type":"ContainerDied","Data":"451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d"} Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.605575 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bb5nh" event={"ID":"6c52987f-80e4-4916-81ef-5e3b4494b810","Type":"ContainerDied","Data":"4d59c99f4441040bea754b33ed941d1c1e4baf8562c9ce385079f44850f142a4"} Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.605600 4867 scope.go:117] "RemoveContainer" containerID="451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.643053 4867 scope.go:117] "RemoveContainer" containerID="ac501a4c404c15fac6622e517c658c36395a56a9a539aecbaf25e84f29726e93" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.643742 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bb5nh"] Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.654922 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bb5nh"] Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.676082 4867 scope.go:117] "RemoveContainer" containerID="f4d101f296eab1e0d86e7b4b4976b4ff7ef6cc30ec0521738132d49b21d0fdbe" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.719184 4867 scope.go:117] "RemoveContainer" containerID="451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d" Oct 06 14:19:10 crc 
kubenswrapper[4867]: E1006 14:19:10.720051 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d\": container with ID starting with 451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d not found: ID does not exist" containerID="451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.720098 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d"} err="failed to get container status \"451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d\": rpc error: code = NotFound desc = could not find container \"451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d\": container with ID starting with 451db8046c77a6a556dcefc28126912076f7117c712dc339b32467fbe84bf57d not found: ID does not exist" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.720140 4867 scope.go:117] "RemoveContainer" containerID="ac501a4c404c15fac6622e517c658c36395a56a9a539aecbaf25e84f29726e93" Oct 06 14:19:10 crc kubenswrapper[4867]: E1006 14:19:10.720655 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac501a4c404c15fac6622e517c658c36395a56a9a539aecbaf25e84f29726e93\": container with ID starting with ac501a4c404c15fac6622e517c658c36395a56a9a539aecbaf25e84f29726e93 not found: ID does not exist" containerID="ac501a4c404c15fac6622e517c658c36395a56a9a539aecbaf25e84f29726e93" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.720708 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac501a4c404c15fac6622e517c658c36395a56a9a539aecbaf25e84f29726e93"} err="failed to get container status 
\"ac501a4c404c15fac6622e517c658c36395a56a9a539aecbaf25e84f29726e93\": rpc error: code = NotFound desc = could not find container \"ac501a4c404c15fac6622e517c658c36395a56a9a539aecbaf25e84f29726e93\": container with ID starting with ac501a4c404c15fac6622e517c658c36395a56a9a539aecbaf25e84f29726e93 not found: ID does not exist" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.720751 4867 scope.go:117] "RemoveContainer" containerID="f4d101f296eab1e0d86e7b4b4976b4ff7ef6cc30ec0521738132d49b21d0fdbe" Oct 06 14:19:10 crc kubenswrapper[4867]: E1006 14:19:10.722030 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4d101f296eab1e0d86e7b4b4976b4ff7ef6cc30ec0521738132d49b21d0fdbe\": container with ID starting with f4d101f296eab1e0d86e7b4b4976b4ff7ef6cc30ec0521738132d49b21d0fdbe not found: ID does not exist" containerID="f4d101f296eab1e0d86e7b4b4976b4ff7ef6cc30ec0521738132d49b21d0fdbe" Oct 06 14:19:10 crc kubenswrapper[4867]: I1006 14:19:10.722156 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4d101f296eab1e0d86e7b4b4976b4ff7ef6cc30ec0521738132d49b21d0fdbe"} err="failed to get container status \"f4d101f296eab1e0d86e7b4b4976b4ff7ef6cc30ec0521738132d49b21d0fdbe\": rpc error: code = NotFound desc = could not find container \"f4d101f296eab1e0d86e7b4b4976b4ff7ef6cc30ec0521738132d49b21d0fdbe\": container with ID starting with f4d101f296eab1e0d86e7b4b4976b4ff7ef6cc30ec0521738132d49b21d0fdbe not found: ID does not exist" Oct 06 14:19:11 crc kubenswrapper[4867]: I1006 14:19:11.240317 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c52987f-80e4-4916-81ef-5e3b4494b810" path="/var/lib/kubelet/pods/6c52987f-80e4-4916-81ef-5e3b4494b810/volumes" Oct 06 14:19:12 crc kubenswrapper[4867]: I1006 14:19:12.874149 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:19:12 crc kubenswrapper[4867]: I1006 14:19:12.874752 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:19:42 crc kubenswrapper[4867]: I1006 14:19:42.874298 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:19:42 crc kubenswrapper[4867]: I1006 14:19:42.874876 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:19:42 crc kubenswrapper[4867]: I1006 14:19:42.874927 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 14:19:42 crc kubenswrapper[4867]: I1006 14:19:42.876375 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 14:19:42 crc 
kubenswrapper[4867]: I1006 14:19:42.876439 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" gracePeriod=600 Oct 06 14:19:42 crc kubenswrapper[4867]: E1006 14:19:42.999342 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:19:43 crc kubenswrapper[4867]: I1006 14:19:43.991215 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" exitCode=0 Oct 06 14:19:43 crc kubenswrapper[4867]: I1006 14:19:43.991303 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246"} Oct 06 14:19:43 crc kubenswrapper[4867]: I1006 14:19:43.991394 4867 scope.go:117] "RemoveContainer" containerID="d5d10b7a2e5392a1994d4a7d473042a2aca3aaea20b30126de41dd23a2674699" Oct 06 14:19:43 crc kubenswrapper[4867]: I1006 14:19:43.992530 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:19:43 crc kubenswrapper[4867]: E1006 14:19:43.992971 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:19:55 crc kubenswrapper[4867]: I1006 14:19:55.222812 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:19:55 crc kubenswrapper[4867]: E1006 14:19:55.223652 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:20:06 crc kubenswrapper[4867]: I1006 14:20:06.221329 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:20:06 crc kubenswrapper[4867]: E1006 14:20:06.222285 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:20:18 crc kubenswrapper[4867]: I1006 14:20:18.221081 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:20:18 crc kubenswrapper[4867]: E1006 14:20:18.222448 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:20:31 crc kubenswrapper[4867]: I1006 14:20:31.230856 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:20:31 crc kubenswrapper[4867]: E1006 14:20:31.231823 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:20:43 crc kubenswrapper[4867]: I1006 14:20:43.221701 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:20:43 crc kubenswrapper[4867]: E1006 14:20:43.224142 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:20:54 crc kubenswrapper[4867]: I1006 14:20:54.227515 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:20:54 crc kubenswrapper[4867]: E1006 14:20:54.228990 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:21:05 crc kubenswrapper[4867]: I1006 14:21:05.222868 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:21:05 crc kubenswrapper[4867]: E1006 14:21:05.223652 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:21:19 crc kubenswrapper[4867]: I1006 14:21:19.221443 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:21:19 crc kubenswrapper[4867]: E1006 14:21:19.222451 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:21:33 crc kubenswrapper[4867]: I1006 14:21:33.221557 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:21:33 crc kubenswrapper[4867]: E1006 14:21:33.222370 4867 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:21:37 crc kubenswrapper[4867]: E1006 14:21:37.202742 4867 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.198:57564->38.102.83.198:45409: write tcp 38.102.83.198:57564->38.102.83.198:45409: write: broken pipe Oct 06 14:21:42 crc kubenswrapper[4867]: E1006 14:21:42.011310 4867 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.198:57720->38.102.83.198:45409: write tcp 38.102.83.198:57720->38.102.83.198:45409: write: broken pipe Oct 06 14:21:44 crc kubenswrapper[4867]: I1006 14:21:44.222281 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:21:44 crc kubenswrapper[4867]: E1006 14:21:44.223141 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:21:58 crc kubenswrapper[4867]: I1006 14:21:58.221399 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:21:58 crc kubenswrapper[4867]: E1006 14:21:58.222390 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:22:09 crc kubenswrapper[4867]: I1006 14:22:09.222584 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:22:09 crc kubenswrapper[4867]: E1006 14:22:09.223756 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:22:23 crc kubenswrapper[4867]: I1006 14:22:23.222390 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:22:23 crc kubenswrapper[4867]: E1006 14:22:23.223540 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:22:35 crc kubenswrapper[4867]: I1006 14:22:35.223530 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:22:35 crc kubenswrapper[4867]: E1006 14:22:35.224429 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:22:48 crc kubenswrapper[4867]: I1006 14:22:48.221410 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:22:48 crc kubenswrapper[4867]: E1006 14:22:48.222517 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:23:01 crc kubenswrapper[4867]: I1006 14:23:01.228198 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:23:01 crc kubenswrapper[4867]: E1006 14:23:01.229557 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:23:12 crc kubenswrapper[4867]: I1006 14:23:12.222056 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:23:12 crc kubenswrapper[4867]: E1006 14:23:12.223191 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.001224 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6wkjx"] Oct 06 14:23:24 crc kubenswrapper[4867]: E1006 14:23:24.002443 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c52987f-80e4-4916-81ef-5e3b4494b810" containerName="extract-content" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.002458 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c52987f-80e4-4916-81ef-5e3b4494b810" containerName="extract-content" Oct 06 14:23:24 crc kubenswrapper[4867]: E1006 14:23:24.002479 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c52987f-80e4-4916-81ef-5e3b4494b810" containerName="registry-server" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.002485 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c52987f-80e4-4916-81ef-5e3b4494b810" containerName="registry-server" Oct 06 14:23:24 crc kubenswrapper[4867]: E1006 14:23:24.002514 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c52987f-80e4-4916-81ef-5e3b4494b810" containerName="extract-utilities" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.002522 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c52987f-80e4-4916-81ef-5e3b4494b810" containerName="extract-utilities" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.002709 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c52987f-80e4-4916-81ef-5e3b4494b810" containerName="registry-server" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.006144 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.013836 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6wkjx"] Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.085213 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-utilities\") pod \"community-operators-6wkjx\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.085825 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-catalog-content\") pod \"community-operators-6wkjx\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.085937 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cghbx\" (UniqueName: \"kubernetes.io/projected/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-kube-api-access-cghbx\") pod \"community-operators-6wkjx\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.188715 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-catalog-content\") pod \"community-operators-6wkjx\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.188811 4867 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cghbx\" (UniqueName: \"kubernetes.io/projected/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-kube-api-access-cghbx\") pod \"community-operators-6wkjx\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.188978 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-utilities\") pod \"community-operators-6wkjx\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.189460 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-catalog-content\") pod \"community-operators-6wkjx\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.189786 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-utilities\") pod \"community-operators-6wkjx\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.222051 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:23:24 crc kubenswrapper[4867]: E1006 14:23:24.222635 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.234707 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cghbx\" (UniqueName: \"kubernetes.io/projected/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-kube-api-access-cghbx\") pod \"community-operators-6wkjx\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.344625 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:24 crc kubenswrapper[4867]: I1006 14:23:24.894627 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6wkjx"] Oct 06 14:23:24 crc kubenswrapper[4867]: W1006 14:23:24.898637 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9cbaee4_8ef6_4e8a_8696_ad086f7f2989.slice/crio-c14db618b9dc51a2648a3b2f6b1764430864920c738999e5b959b3031216671e WatchSource:0}: Error finding container c14db618b9dc51a2648a3b2f6b1764430864920c738999e5b959b3031216671e: Status 404 returned error can't find the container with id c14db618b9dc51a2648a3b2f6b1764430864920c738999e5b959b3031216671e Oct 06 14:23:25 crc kubenswrapper[4867]: I1006 14:23:25.394605 4867 generic.go:334] "Generic (PLEG): container finished" podID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" containerID="d097106b79cfc44046df5e1cbce64c60e5d93914219bd44bceb45b7647bd69d6" exitCode=0 Oct 06 14:23:25 crc kubenswrapper[4867]: I1006 14:23:25.394843 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wkjx" 
event={"ID":"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989","Type":"ContainerDied","Data":"d097106b79cfc44046df5e1cbce64c60e5d93914219bd44bceb45b7647bd69d6"} Oct 06 14:23:25 crc kubenswrapper[4867]: I1006 14:23:25.395091 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wkjx" event={"ID":"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989","Type":"ContainerStarted","Data":"c14db618b9dc51a2648a3b2f6b1764430864920c738999e5b959b3031216671e"} Oct 06 14:23:26 crc kubenswrapper[4867]: I1006 14:23:26.409505 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wkjx" event={"ID":"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989","Type":"ContainerStarted","Data":"70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba"} Oct 06 14:23:27 crc kubenswrapper[4867]: I1006 14:23:27.427125 4867 generic.go:334] "Generic (PLEG): container finished" podID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" containerID="70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba" exitCode=0 Oct 06 14:23:27 crc kubenswrapper[4867]: I1006 14:23:27.427284 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wkjx" event={"ID":"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989","Type":"ContainerDied","Data":"70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba"} Oct 06 14:23:28 crc kubenswrapper[4867]: I1006 14:23:28.450770 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wkjx" event={"ID":"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989","Type":"ContainerStarted","Data":"4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c"} Oct 06 14:23:28 crc kubenswrapper[4867]: I1006 14:23:28.475152 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6wkjx" podStartSLOduration=2.958072052 podStartE2EDuration="5.47513472s" podCreationTimestamp="2025-10-06 14:23:23 
+0000 UTC" firstStartedPulling="2025-10-06 14:23:25.398971096 +0000 UTC m=+4784.856919290" lastFinishedPulling="2025-10-06 14:23:27.916033814 +0000 UTC m=+4787.373981958" observedRunningTime="2025-10-06 14:23:28.471117701 +0000 UTC m=+4787.929065855" watchObservedRunningTime="2025-10-06 14:23:28.47513472 +0000 UTC m=+4787.933082864" Oct 06 14:23:34 crc kubenswrapper[4867]: I1006 14:23:34.344772 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:34 crc kubenswrapper[4867]: I1006 14:23:34.345594 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:34 crc kubenswrapper[4867]: I1006 14:23:34.409141 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:34 crc kubenswrapper[4867]: I1006 14:23:34.571438 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:34 crc kubenswrapper[4867]: I1006 14:23:34.646109 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6wkjx"] Oct 06 14:23:36 crc kubenswrapper[4867]: I1006 14:23:36.544113 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6wkjx" podUID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" containerName="registry-server" containerID="cri-o://4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c" gracePeriod=2 Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.196044 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.398715 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-catalog-content\") pod \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.399561 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cghbx\" (UniqueName: \"kubernetes.io/projected/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-kube-api-access-cghbx\") pod \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.400763 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-utilities" (OuterVolumeSpecName: "utilities") pod "f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" (UID: "f9cbaee4-8ef6-4e8a-8696-ad086f7f2989"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.399840 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-utilities\") pod \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\" (UID: \"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989\") " Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.402868 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.410700 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-kube-api-access-cghbx" (OuterVolumeSpecName: "kube-api-access-cghbx") pod "f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" (UID: "f9cbaee4-8ef6-4e8a-8696-ad086f7f2989"). InnerVolumeSpecName "kube-api-access-cghbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.465172 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" (UID: "f9cbaee4-8ef6-4e8a-8696-ad086f7f2989"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.505591 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.505633 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cghbx\" (UniqueName: \"kubernetes.io/projected/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989-kube-api-access-cghbx\") on node \"crc\" DevicePath \"\"" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.558709 4867 generic.go:334] "Generic (PLEG): container finished" podID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" containerID="4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c" exitCode=0 Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.558776 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wkjx" event={"ID":"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989","Type":"ContainerDied","Data":"4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c"} Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.558817 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6wkjx" event={"ID":"f9cbaee4-8ef6-4e8a-8696-ad086f7f2989","Type":"ContainerDied","Data":"c14db618b9dc51a2648a3b2f6b1764430864920c738999e5b959b3031216671e"} Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.558840 4867 scope.go:117] "RemoveContainer" containerID="4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.559043 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6wkjx" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.591163 4867 scope.go:117] "RemoveContainer" containerID="70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.601114 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6wkjx"] Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.610279 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6wkjx"] Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.649799 4867 scope.go:117] "RemoveContainer" containerID="d097106b79cfc44046df5e1cbce64c60e5d93914219bd44bceb45b7647bd69d6" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.686866 4867 scope.go:117] "RemoveContainer" containerID="4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c" Oct 06 14:23:37 crc kubenswrapper[4867]: E1006 14:23:37.687813 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c\": container with ID starting with 4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c not found: ID does not exist" containerID="4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.687933 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c"} err="failed to get container status \"4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c\": rpc error: code = NotFound desc = could not find container \"4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c\": container with ID starting with 4d502aeaf0e90d4cfbee8dfb3608ff39f5aa8b658c401a72eae82fb38396c90c not 
found: ID does not exist" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.688013 4867 scope.go:117] "RemoveContainer" containerID="70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba" Oct 06 14:23:37 crc kubenswrapper[4867]: E1006 14:23:37.689009 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba\": container with ID starting with 70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba not found: ID does not exist" containerID="70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.689041 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba"} err="failed to get container status \"70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba\": rpc error: code = NotFound desc = could not find container \"70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba\": container with ID starting with 70f343697f180442b4de47ef2b90774952f57cf826a8ba032bebf0aab7b889ba not found: ID does not exist" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.689060 4867 scope.go:117] "RemoveContainer" containerID="d097106b79cfc44046df5e1cbce64c60e5d93914219bd44bceb45b7647bd69d6" Oct 06 14:23:37 crc kubenswrapper[4867]: E1006 14:23:37.689684 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d097106b79cfc44046df5e1cbce64c60e5d93914219bd44bceb45b7647bd69d6\": container with ID starting with d097106b79cfc44046df5e1cbce64c60e5d93914219bd44bceb45b7647bd69d6 not found: ID does not exist" containerID="d097106b79cfc44046df5e1cbce64c60e5d93914219bd44bceb45b7647bd69d6" Oct 06 14:23:37 crc kubenswrapper[4867]: I1006 14:23:37.689720 4867 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d097106b79cfc44046df5e1cbce64c60e5d93914219bd44bceb45b7647bd69d6"} err="failed to get container status \"d097106b79cfc44046df5e1cbce64c60e5d93914219bd44bceb45b7647bd69d6\": rpc error: code = NotFound desc = could not find container \"d097106b79cfc44046df5e1cbce64c60e5d93914219bd44bceb45b7647bd69d6\": container with ID starting with d097106b79cfc44046df5e1cbce64c60e5d93914219bd44bceb45b7647bd69d6 not found: ID does not exist" Oct 06 14:23:38 crc kubenswrapper[4867]: I1006 14:23:38.222490 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:23:38 crc kubenswrapper[4867]: E1006 14:23:38.222936 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:23:39 crc kubenswrapper[4867]: I1006 14:23:39.236059 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" path="/var/lib/kubelet/pods/f9cbaee4-8ef6-4e8a-8696-ad086f7f2989/volumes" Oct 06 14:23:51 crc kubenswrapper[4867]: I1006 14:23:51.222755 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:23:51 crc kubenswrapper[4867]: E1006 14:23:51.223628 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:24:04 crc kubenswrapper[4867]: I1006 14:24:04.222906 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:24:04 crc kubenswrapper[4867]: E1006 14:24:04.224314 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:24:18 crc kubenswrapper[4867]: I1006 14:24:18.221400 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:24:18 crc kubenswrapper[4867]: E1006 14:24:18.222309 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:24:30 crc kubenswrapper[4867]: I1006 14:24:30.222534 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:24:30 crc kubenswrapper[4867]: E1006 14:24:30.223580 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.293380 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jnn44"] Oct 06 14:24:32 crc kubenswrapper[4867]: E1006 14:24:32.294504 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" containerName="extract-utilities" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.294523 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" containerName="extract-utilities" Oct 06 14:24:32 crc kubenswrapper[4867]: E1006 14:24:32.294533 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" containerName="extract-content" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.294540 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" containerName="extract-content" Oct 06 14:24:32 crc kubenswrapper[4867]: E1006 14:24:32.294561 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" containerName="registry-server" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.294568 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" containerName="registry-server" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.294800 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9cbaee4-8ef6-4e8a-8696-ad086f7f2989" containerName="registry-server" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.296943 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.317659 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnn44"] Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.383377 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-catalog-content\") pod \"redhat-marketplace-jnn44\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.383471 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94s2c\" (UniqueName: \"kubernetes.io/projected/794ee00a-4589-46d3-9821-550a02d5caca-kube-api-access-94s2c\") pod \"redhat-marketplace-jnn44\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.383839 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-utilities\") pod \"redhat-marketplace-jnn44\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.487027 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-utilities\") pod \"redhat-marketplace-jnn44\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.487725 4867 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-utilities\") pod \"redhat-marketplace-jnn44\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.487739 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-catalog-content\") pod \"redhat-marketplace-jnn44\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.488036 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94s2c\" (UniqueName: \"kubernetes.io/projected/794ee00a-4589-46d3-9821-550a02d5caca-kube-api-access-94s2c\") pod \"redhat-marketplace-jnn44\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.488301 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-catalog-content\") pod \"redhat-marketplace-jnn44\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.511546 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94s2c\" (UniqueName: \"kubernetes.io/projected/794ee00a-4589-46d3-9821-550a02d5caca-kube-api-access-94s2c\") pod \"redhat-marketplace-jnn44\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:32 crc kubenswrapper[4867]: I1006 14:24:32.629552 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:33 crc kubenswrapper[4867]: I1006 14:24:33.711160 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnn44"] Oct 06 14:24:33 crc kubenswrapper[4867]: W1006 14:24:33.726561 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod794ee00a_4589_46d3_9821_550a02d5caca.slice/crio-84661b198fccaa3fe315507efec73785b229fcd9211a2db22f848ccf2be83640 WatchSource:0}: Error finding container 84661b198fccaa3fe315507efec73785b229fcd9211a2db22f848ccf2be83640: Status 404 returned error can't find the container with id 84661b198fccaa3fe315507efec73785b229fcd9211a2db22f848ccf2be83640 Oct 06 14:24:34 crc kubenswrapper[4867]: I1006 14:24:34.198445 4867 generic.go:334] "Generic (PLEG): container finished" podID="794ee00a-4589-46d3-9821-550a02d5caca" containerID="103b145a1165b7cd88ea843297c289a2e1548657462bac5297bc8f80e5180869" exitCode=0 Oct 06 14:24:34 crc kubenswrapper[4867]: I1006 14:24:34.198546 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnn44" event={"ID":"794ee00a-4589-46d3-9821-550a02d5caca","Type":"ContainerDied","Data":"103b145a1165b7cd88ea843297c289a2e1548657462bac5297bc8f80e5180869"} Oct 06 14:24:34 crc kubenswrapper[4867]: I1006 14:24:34.198858 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnn44" event={"ID":"794ee00a-4589-46d3-9821-550a02d5caca","Type":"ContainerStarted","Data":"84661b198fccaa3fe315507efec73785b229fcd9211a2db22f848ccf2be83640"} Oct 06 14:24:34 crc kubenswrapper[4867]: I1006 14:24:34.201595 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 14:24:36 crc kubenswrapper[4867]: I1006 14:24:36.234938 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-jnn44" event={"ID":"794ee00a-4589-46d3-9821-550a02d5caca","Type":"ContainerStarted","Data":"3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051"} Oct 06 14:24:37 crc kubenswrapper[4867]: I1006 14:24:37.248365 4867 generic.go:334] "Generic (PLEG): container finished" podID="794ee00a-4589-46d3-9821-550a02d5caca" containerID="3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051" exitCode=0 Oct 06 14:24:37 crc kubenswrapper[4867]: I1006 14:24:37.248471 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnn44" event={"ID":"794ee00a-4589-46d3-9821-550a02d5caca","Type":"ContainerDied","Data":"3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051"} Oct 06 14:24:38 crc kubenswrapper[4867]: I1006 14:24:38.266348 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnn44" event={"ID":"794ee00a-4589-46d3-9821-550a02d5caca","Type":"ContainerStarted","Data":"e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01"} Oct 06 14:24:38 crc kubenswrapper[4867]: I1006 14:24:38.289213 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jnn44" podStartSLOduration=2.581494385 podStartE2EDuration="6.289193513s" podCreationTimestamp="2025-10-06 14:24:32 +0000 UTC" firstStartedPulling="2025-10-06 14:24:34.200980344 +0000 UTC m=+4853.658928498" lastFinishedPulling="2025-10-06 14:24:37.908679472 +0000 UTC m=+4857.366627626" observedRunningTime="2025-10-06 14:24:38.287189939 +0000 UTC m=+4857.745138103" watchObservedRunningTime="2025-10-06 14:24:38.289193513 +0000 UTC m=+4857.747141657" Oct 06 14:24:42 crc kubenswrapper[4867]: I1006 14:24:42.631185 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:42 crc kubenswrapper[4867]: I1006 14:24:42.632583 4867 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:42 crc kubenswrapper[4867]: I1006 14:24:42.694509 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:43 crc kubenswrapper[4867]: I1006 14:24:43.374457 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:43 crc kubenswrapper[4867]: I1006 14:24:43.437446 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnn44"] Oct 06 14:24:45 crc kubenswrapper[4867]: I1006 14:24:45.222319 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:24:45 crc kubenswrapper[4867]: I1006 14:24:45.345369 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jnn44" podUID="794ee00a-4589-46d3-9821-550a02d5caca" containerName="registry-server" containerID="cri-o://e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01" gracePeriod=2 Oct 06 14:24:45 crc kubenswrapper[4867]: I1006 14:24:45.830837 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:45 crc kubenswrapper[4867]: I1006 14:24:45.951513 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-utilities\") pod \"794ee00a-4589-46d3-9821-550a02d5caca\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " Oct 06 14:24:45 crc kubenswrapper[4867]: I1006 14:24:45.951746 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94s2c\" (UniqueName: \"kubernetes.io/projected/794ee00a-4589-46d3-9821-550a02d5caca-kube-api-access-94s2c\") pod \"794ee00a-4589-46d3-9821-550a02d5caca\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " Oct 06 14:24:45 crc kubenswrapper[4867]: I1006 14:24:45.951853 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-catalog-content\") pod \"794ee00a-4589-46d3-9821-550a02d5caca\" (UID: \"794ee00a-4589-46d3-9821-550a02d5caca\") " Oct 06 14:24:45 crc kubenswrapper[4867]: I1006 14:24:45.952828 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-utilities" (OuterVolumeSpecName: "utilities") pod "794ee00a-4589-46d3-9821-550a02d5caca" (UID: "794ee00a-4589-46d3-9821-550a02d5caca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:24:45 crc kubenswrapper[4867]: I1006 14:24:45.959445 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794ee00a-4589-46d3-9821-550a02d5caca-kube-api-access-94s2c" (OuterVolumeSpecName: "kube-api-access-94s2c") pod "794ee00a-4589-46d3-9821-550a02d5caca" (UID: "794ee00a-4589-46d3-9821-550a02d5caca"). InnerVolumeSpecName "kube-api-access-94s2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:24:45 crc kubenswrapper[4867]: I1006 14:24:45.966171 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "794ee00a-4589-46d3-9821-550a02d5caca" (UID: "794ee00a-4589-46d3-9821-550a02d5caca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.054679 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94s2c\" (UniqueName: \"kubernetes.io/projected/794ee00a-4589-46d3-9821-550a02d5caca-kube-api-access-94s2c\") on node \"crc\" DevicePath \"\"" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.054715 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.054727 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/794ee00a-4589-46d3-9821-550a02d5caca-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.375216 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"5962e81fccdfe31ac3ecff5cdae36fe695f010cdbe5374fcd4f1b6775b5533a1"} Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.381351 4867 generic.go:334] "Generic (PLEG): container finished" podID="794ee00a-4589-46d3-9821-550a02d5caca" containerID="e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01" exitCode=0 Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.381422 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnn44" event={"ID":"794ee00a-4589-46d3-9821-550a02d5caca","Type":"ContainerDied","Data":"e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01"} Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.381465 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnn44" event={"ID":"794ee00a-4589-46d3-9821-550a02d5caca","Type":"ContainerDied","Data":"84661b198fccaa3fe315507efec73785b229fcd9211a2db22f848ccf2be83640"} Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.381673 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnn44" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.382573 4867 scope.go:117] "RemoveContainer" containerID="e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.418602 4867 scope.go:117] "RemoveContainer" containerID="3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.453075 4867 scope.go:117] "RemoveContainer" containerID="103b145a1165b7cd88ea843297c289a2e1548657462bac5297bc8f80e5180869" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.507431 4867 scope.go:117] "RemoveContainer" containerID="e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01" Oct 06 14:24:46 crc kubenswrapper[4867]: E1006 14:24:46.508392 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01\": container with ID starting with e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01 not found: ID does not exist" containerID="e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.508435 4867 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01"} err="failed to get container status \"e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01\": rpc error: code = NotFound desc = could not find container \"e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01\": container with ID starting with e9cbc75827870351f64571a667566933420ca6bd698b6893dba6c179e73cad01 not found: ID does not exist" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.508462 4867 scope.go:117] "RemoveContainer" containerID="3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051" Oct 06 14:24:46 crc kubenswrapper[4867]: E1006 14:24:46.508941 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051\": container with ID starting with 3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051 not found: ID does not exist" containerID="3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.508985 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051"} err="failed to get container status \"3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051\": rpc error: code = NotFound desc = could not find container \"3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051\": container with ID starting with 3d74f91f89d8f9eb5dad764bb4973dd189cb2be2035932bf0c839c1b982d6051 not found: ID does not exist" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.509012 4867 scope.go:117] "RemoveContainer" containerID="103b145a1165b7cd88ea843297c289a2e1548657462bac5297bc8f80e5180869" Oct 06 14:24:46 crc kubenswrapper[4867]: E1006 
14:24:46.509389 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"103b145a1165b7cd88ea843297c289a2e1548657462bac5297bc8f80e5180869\": container with ID starting with 103b145a1165b7cd88ea843297c289a2e1548657462bac5297bc8f80e5180869 not found: ID does not exist" containerID="103b145a1165b7cd88ea843297c289a2e1548657462bac5297bc8f80e5180869" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.509420 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"103b145a1165b7cd88ea843297c289a2e1548657462bac5297bc8f80e5180869"} err="failed to get container status \"103b145a1165b7cd88ea843297c289a2e1548657462bac5297bc8f80e5180869\": rpc error: code = NotFound desc = could not find container \"103b145a1165b7cd88ea843297c289a2e1548657462bac5297bc8f80e5180869\": container with ID starting with 103b145a1165b7cd88ea843297c289a2e1548657462bac5297bc8f80e5180869 not found: ID does not exist" Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.565340 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnn44"] Oct 06 14:24:46 crc kubenswrapper[4867]: I1006 14:24:46.576433 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnn44"] Oct 06 14:24:47 crc kubenswrapper[4867]: I1006 14:24:47.235735 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794ee00a-4589-46d3-9821-550a02d5caca" path="/var/lib/kubelet/pods/794ee00a-4589-46d3-9821-550a02d5caca/volumes" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.482460 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f85vr"] Oct 06 14:25:00 crc kubenswrapper[4867]: E1006 14:25:00.484344 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794ee00a-4589-46d3-9821-550a02d5caca" containerName="extract-utilities" Oct 06 14:25:00 
crc kubenswrapper[4867]: I1006 14:25:00.484372 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="794ee00a-4589-46d3-9821-550a02d5caca" containerName="extract-utilities" Oct 06 14:25:00 crc kubenswrapper[4867]: E1006 14:25:00.484388 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794ee00a-4589-46d3-9821-550a02d5caca" containerName="registry-server" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.484396 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="794ee00a-4589-46d3-9821-550a02d5caca" containerName="registry-server" Oct 06 14:25:00 crc kubenswrapper[4867]: E1006 14:25:00.484412 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794ee00a-4589-46d3-9821-550a02d5caca" containerName="extract-content" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.484420 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="794ee00a-4589-46d3-9821-550a02d5caca" containerName="extract-content" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.484611 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="794ee00a-4589-46d3-9821-550a02d5caca" containerName="registry-server" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.487272 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.517722 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f85vr"] Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.600270 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7d5n\" (UniqueName: \"kubernetes.io/projected/f466559d-88ca-40ca-aa83-7dcf8cf436f4-kube-api-access-x7d5n\") pod \"certified-operators-f85vr\" (UID: \"f466559d-88ca-40ca-aa83-7dcf8cf436f4\") " pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.600343 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f466559d-88ca-40ca-aa83-7dcf8cf436f4-utilities\") pod \"certified-operators-f85vr\" (UID: \"f466559d-88ca-40ca-aa83-7dcf8cf436f4\") " pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.600374 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f466559d-88ca-40ca-aa83-7dcf8cf436f4-catalog-content\") pod \"certified-operators-f85vr\" (UID: \"f466559d-88ca-40ca-aa83-7dcf8cf436f4\") " pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.720342 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7d5n\" (UniqueName: \"kubernetes.io/projected/f466559d-88ca-40ca-aa83-7dcf8cf436f4-kube-api-access-x7d5n\") pod \"certified-operators-f85vr\" (UID: \"f466559d-88ca-40ca-aa83-7dcf8cf436f4\") " pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.720791 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f466559d-88ca-40ca-aa83-7dcf8cf436f4-utilities\") pod \"certified-operators-f85vr\" (UID: \"f466559d-88ca-40ca-aa83-7dcf8cf436f4\") " pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.720849 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f466559d-88ca-40ca-aa83-7dcf8cf436f4-catalog-content\") pod \"certified-operators-f85vr\" (UID: \"f466559d-88ca-40ca-aa83-7dcf8cf436f4\") " pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.721804 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f466559d-88ca-40ca-aa83-7dcf8cf436f4-utilities\") pod \"certified-operators-f85vr\" (UID: \"f466559d-88ca-40ca-aa83-7dcf8cf436f4\") " pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.721956 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f466559d-88ca-40ca-aa83-7dcf8cf436f4-catalog-content\") pod \"certified-operators-f85vr\" (UID: \"f466559d-88ca-40ca-aa83-7dcf8cf436f4\") " pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.750423 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7d5n\" (UniqueName: \"kubernetes.io/projected/f466559d-88ca-40ca-aa83-7dcf8cf436f4-kube-api-access-x7d5n\") pod \"certified-operators-f85vr\" (UID: \"f466559d-88ca-40ca-aa83-7dcf8cf436f4\") " pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:00 crc kubenswrapper[4867]: I1006 14:25:00.848873 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:01 crc kubenswrapper[4867]: I1006 14:25:01.522137 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f85vr"] Oct 06 14:25:01 crc kubenswrapper[4867]: I1006 14:25:01.546484 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f85vr" event={"ID":"f466559d-88ca-40ca-aa83-7dcf8cf436f4","Type":"ContainerStarted","Data":"29dcf76136b4a876dc8c29c8404bce174df1ebdc00615c78cd79a584e27c995c"} Oct 06 14:25:02 crc kubenswrapper[4867]: I1006 14:25:02.560475 4867 generic.go:334] "Generic (PLEG): container finished" podID="f466559d-88ca-40ca-aa83-7dcf8cf436f4" containerID="91b1814354ca5faf47da2ab826cb989bbc8978a8069760c7afff28f7d3a865d7" exitCode=0 Oct 06 14:25:02 crc kubenswrapper[4867]: I1006 14:25:02.561562 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f85vr" event={"ID":"f466559d-88ca-40ca-aa83-7dcf8cf436f4","Type":"ContainerDied","Data":"91b1814354ca5faf47da2ab826cb989bbc8978a8069760c7afff28f7d3a865d7"} Oct 06 14:25:07 crc kubenswrapper[4867]: I1006 14:25:07.629127 4867 generic.go:334] "Generic (PLEG): container finished" podID="f466559d-88ca-40ca-aa83-7dcf8cf436f4" containerID="2f6256fb174a036016c32490e08f554ebaa0f0dc1bc21e4913e3164b928d1073" exitCode=0 Oct 06 14:25:07 crc kubenswrapper[4867]: I1006 14:25:07.629281 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f85vr" event={"ID":"f466559d-88ca-40ca-aa83-7dcf8cf436f4","Type":"ContainerDied","Data":"2f6256fb174a036016c32490e08f554ebaa0f0dc1bc21e4913e3164b928d1073"} Oct 06 14:25:08 crc kubenswrapper[4867]: I1006 14:25:08.642149 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f85vr" 
event={"ID":"f466559d-88ca-40ca-aa83-7dcf8cf436f4","Type":"ContainerStarted","Data":"a1453902a2c11417526d37cdec3a48cb020916cbb2f6b5d9e839ad8021f2fd71"} Oct 06 14:25:08 crc kubenswrapper[4867]: I1006 14:25:08.673855 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f85vr" podStartSLOduration=3.2150426420000002 podStartE2EDuration="8.673822052s" podCreationTimestamp="2025-10-06 14:25:00 +0000 UTC" firstStartedPulling="2025-10-06 14:25:02.564119995 +0000 UTC m=+4882.022068139" lastFinishedPulling="2025-10-06 14:25:08.022899405 +0000 UTC m=+4887.480847549" observedRunningTime="2025-10-06 14:25:08.660570604 +0000 UTC m=+4888.118518758" watchObservedRunningTime="2025-10-06 14:25:08.673822052 +0000 UTC m=+4888.131770196" Oct 06 14:25:10 crc kubenswrapper[4867]: I1006 14:25:10.849525 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:10 crc kubenswrapper[4867]: I1006 14:25:10.850051 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:10 crc kubenswrapper[4867]: I1006 14:25:10.902907 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:21 crc kubenswrapper[4867]: I1006 14:25:21.237936 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f85vr" Oct 06 14:25:21 crc kubenswrapper[4867]: I1006 14:25:21.317966 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f85vr"] Oct 06 14:25:21 crc kubenswrapper[4867]: I1006 14:25:21.385355 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xxfqf"] Oct 06 14:25:21 crc kubenswrapper[4867]: I1006 14:25:21.385792 4867 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/certified-operators-xxfqf" podUID="dab5b2af-998d-46de-8029-37bc95f7abb0" containerName="registry-server" containerID="cri-o://1e0852cdb34741a73225e3e0df387537eed8688f70d84ffc44353bd7ef523c91" gracePeriod=2 Oct 06 14:25:21 crc kubenswrapper[4867]: I1006 14:25:21.771449 4867 generic.go:334] "Generic (PLEG): container finished" podID="dab5b2af-998d-46de-8029-37bc95f7abb0" containerID="1e0852cdb34741a73225e3e0df387537eed8688f70d84ffc44353bd7ef523c91" exitCode=0 Oct 06 14:25:21 crc kubenswrapper[4867]: I1006 14:25:21.773355 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfqf" event={"ID":"dab5b2af-998d-46de-8029-37bc95f7abb0","Type":"ContainerDied","Data":"1e0852cdb34741a73225e3e0df387537eed8688f70d84ffc44353bd7ef523c91"} Oct 06 14:25:21 crc kubenswrapper[4867]: I1006 14:25:21.903722 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxfqf" Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.035424 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-catalog-content\") pod \"dab5b2af-998d-46de-8029-37bc95f7abb0\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.035553 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnxp9\" (UniqueName: \"kubernetes.io/projected/dab5b2af-998d-46de-8029-37bc95f7abb0-kube-api-access-bnxp9\") pod \"dab5b2af-998d-46de-8029-37bc95f7abb0\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.035746 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-utilities\") pod \"dab5b2af-998d-46de-8029-37bc95f7abb0\" (UID: \"dab5b2af-998d-46de-8029-37bc95f7abb0\") " Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.038065 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-utilities" (OuterVolumeSpecName: "utilities") pod "dab5b2af-998d-46de-8029-37bc95f7abb0" (UID: "dab5b2af-998d-46de-8029-37bc95f7abb0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.046834 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab5b2af-998d-46de-8029-37bc95f7abb0-kube-api-access-bnxp9" (OuterVolumeSpecName: "kube-api-access-bnxp9") pod "dab5b2af-998d-46de-8029-37bc95f7abb0" (UID: "dab5b2af-998d-46de-8029-37bc95f7abb0"). InnerVolumeSpecName "kube-api-access-bnxp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.096286 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dab5b2af-998d-46de-8029-37bc95f7abb0" (UID: "dab5b2af-998d-46de-8029-37bc95f7abb0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.138451 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.138808 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dab5b2af-998d-46de-8029-37bc95f7abb0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.138874 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnxp9\" (UniqueName: \"kubernetes.io/projected/dab5b2af-998d-46de-8029-37bc95f7abb0-kube-api-access-bnxp9\") on node \"crc\" DevicePath \"\"" Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.809959 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxfqf" event={"ID":"dab5b2af-998d-46de-8029-37bc95f7abb0","Type":"ContainerDied","Data":"7325c242b5851f88b47c1304abc829a114dc829522d195189c3d96c346c0aead"} Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.810463 4867 scope.go:117] "RemoveContainer" containerID="1e0852cdb34741a73225e3e0df387537eed8688f70d84ffc44353bd7ef523c91" Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.810115 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xxfqf" Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.852379 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xxfqf"] Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.856520 4867 scope.go:117] "RemoveContainer" containerID="7862d997fa8da0f8166b2a77cacd05f1f50dbf6ce63755c3369827df36882f22" Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.861328 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xxfqf"] Oct 06 14:25:22 crc kubenswrapper[4867]: I1006 14:25:22.887361 4867 scope.go:117] "RemoveContainer" containerID="3a1ec7c804860a67684024edd77a89e5c13d1800009e337fb9648e3f65fae010" Oct 06 14:25:23 crc kubenswrapper[4867]: I1006 14:25:23.232318 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab5b2af-998d-46de-8029-37bc95f7abb0" path="/var/lib/kubelet/pods/dab5b2af-998d-46de-8029-37bc95f7abb0/volumes" Oct 06 14:27:12 crc kubenswrapper[4867]: I1006 14:27:12.873574 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:27:12 crc kubenswrapper[4867]: I1006 14:27:12.874327 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:27:42 crc kubenswrapper[4867]: I1006 14:27:42.873566 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:27:42 crc kubenswrapper[4867]: I1006 14:27:42.874598 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:28:12 crc kubenswrapper[4867]: I1006 14:28:12.873706 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:28:12 crc kubenswrapper[4867]: I1006 14:28:12.874176 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:28:12 crc kubenswrapper[4867]: I1006 14:28:12.874218 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 14:28:12 crc kubenswrapper[4867]: I1006 14:28:12.875118 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5962e81fccdfe31ac3ecff5cdae36fe695f010cdbe5374fcd4f1b6775b5533a1"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 14:28:12 crc kubenswrapper[4867]: I1006 14:28:12.875174 4867 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://5962e81fccdfe31ac3ecff5cdae36fe695f010cdbe5374fcd4f1b6775b5533a1" gracePeriod=600 Oct 06 14:28:13 crc kubenswrapper[4867]: I1006 14:28:13.730772 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="5962e81fccdfe31ac3ecff5cdae36fe695f010cdbe5374fcd4f1b6775b5533a1" exitCode=0 Oct 06 14:28:13 crc kubenswrapper[4867]: I1006 14:28:13.730858 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"5962e81fccdfe31ac3ecff5cdae36fe695f010cdbe5374fcd4f1b6775b5533a1"} Oct 06 14:28:13 crc kubenswrapper[4867]: I1006 14:28:13.731509 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068"} Oct 06 14:28:13 crc kubenswrapper[4867]: I1006 14:28:13.731541 4867 scope.go:117] "RemoveContainer" containerID="705ab03b74a511c4c271b0951ace804b47868a6113729d5a2b838d58ef428246" Oct 06 14:29:19 crc kubenswrapper[4867]: E1006 14:29:19.897838 4867 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.198:50542->38.102.83.198:45409: write tcp 38.102.83.198:50542->38.102.83.198:45409: write: broken pipe Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.145324 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5"] Oct 06 14:30:00 crc kubenswrapper[4867]: E1006 14:30:00.146643 4867 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="dab5b2af-998d-46de-8029-37bc95f7abb0" containerName="extract-utilities" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.146662 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab5b2af-998d-46de-8029-37bc95f7abb0" containerName="extract-utilities" Oct 06 14:30:00 crc kubenswrapper[4867]: E1006 14:30:00.146692 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab5b2af-998d-46de-8029-37bc95f7abb0" containerName="registry-server" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.146701 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab5b2af-998d-46de-8029-37bc95f7abb0" containerName="registry-server" Oct 06 14:30:00 crc kubenswrapper[4867]: E1006 14:30:00.146768 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab5b2af-998d-46de-8029-37bc95f7abb0" containerName="extract-content" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.146778 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab5b2af-998d-46de-8029-37bc95f7abb0" containerName="extract-content" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.147008 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab5b2af-998d-46de-8029-37bc95f7abb0" containerName="registry-server" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.148424 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.151943 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.152653 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.155431 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5"] Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.252521 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9tmd\" (UniqueName: \"kubernetes.io/projected/bf878bd0-8992-4ff9-be60-83e858b2a690-kube-api-access-k9tmd\") pod \"collect-profiles-29329350-dljz5\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.253034 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf878bd0-8992-4ff9-be60-83e858b2a690-secret-volume\") pod \"collect-profiles-29329350-dljz5\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.253381 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf878bd0-8992-4ff9-be60-83e858b2a690-config-volume\") pod \"collect-profiles-29329350-dljz5\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.355103 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf878bd0-8992-4ff9-be60-83e858b2a690-secret-volume\") pod \"collect-profiles-29329350-dljz5\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.355190 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf878bd0-8992-4ff9-be60-83e858b2a690-config-volume\") pod \"collect-profiles-29329350-dljz5\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.355264 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9tmd\" (UniqueName: \"kubernetes.io/projected/bf878bd0-8992-4ff9-be60-83e858b2a690-kube-api-access-k9tmd\") pod \"collect-profiles-29329350-dljz5\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.356071 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf878bd0-8992-4ff9-be60-83e858b2a690-config-volume\") pod \"collect-profiles-29329350-dljz5\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.360739 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/bf878bd0-8992-4ff9-be60-83e858b2a690-secret-volume\") pod \"collect-profiles-29329350-dljz5\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.371238 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9tmd\" (UniqueName: \"kubernetes.io/projected/bf878bd0-8992-4ff9-be60-83e858b2a690-kube-api-access-k9tmd\") pod \"collect-profiles-29329350-dljz5\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.469367 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:00 crc kubenswrapper[4867]: I1006 14:30:00.925421 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5"] Oct 06 14:30:00 crc kubenswrapper[4867]: W1006 14:30:00.934098 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf878bd0_8992_4ff9_be60_83e858b2a690.slice/crio-f14c470f18367450bc3241a7481d756ae300bd424d295df3bf8df721a2eb5292 WatchSource:0}: Error finding container f14c470f18367450bc3241a7481d756ae300bd424d295df3bf8df721a2eb5292: Status 404 returned error can't find the container with id f14c470f18367450bc3241a7481d756ae300bd424d295df3bf8df721a2eb5292 Oct 06 14:30:01 crc kubenswrapper[4867]: I1006 14:30:01.792179 4867 generic.go:334] "Generic (PLEG): container finished" podID="bf878bd0-8992-4ff9-be60-83e858b2a690" containerID="10c33111632827d2d45505ee224db447324022cced5dabe98e534658e2441a9f" exitCode=0 Oct 06 14:30:01 crc kubenswrapper[4867]: I1006 14:30:01.792267 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" event={"ID":"bf878bd0-8992-4ff9-be60-83e858b2a690","Type":"ContainerDied","Data":"10c33111632827d2d45505ee224db447324022cced5dabe98e534658e2441a9f"} Oct 06 14:30:01 crc kubenswrapper[4867]: I1006 14:30:01.792551 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" event={"ID":"bf878bd0-8992-4ff9-be60-83e858b2a690","Type":"ContainerStarted","Data":"f14c470f18367450bc3241a7481d756ae300bd424d295df3bf8df721a2eb5292"} Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.183166 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.329819 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf878bd0-8992-4ff9-be60-83e858b2a690-secret-volume\") pod \"bf878bd0-8992-4ff9-be60-83e858b2a690\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.330307 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9tmd\" (UniqueName: \"kubernetes.io/projected/bf878bd0-8992-4ff9-be60-83e858b2a690-kube-api-access-k9tmd\") pod \"bf878bd0-8992-4ff9-be60-83e858b2a690\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.330596 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf878bd0-8992-4ff9-be60-83e858b2a690-config-volume\") pod \"bf878bd0-8992-4ff9-be60-83e858b2a690\" (UID: \"bf878bd0-8992-4ff9-be60-83e858b2a690\") " Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.331397 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/bf878bd0-8992-4ff9-be60-83e858b2a690-config-volume" (OuterVolumeSpecName: "config-volume") pod "bf878bd0-8992-4ff9-be60-83e858b2a690" (UID: "bf878bd0-8992-4ff9-be60-83e858b2a690"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.333233 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf878bd0-8992-4ff9-be60-83e858b2a690-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.337392 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf878bd0-8992-4ff9-be60-83e858b2a690-kube-api-access-k9tmd" (OuterVolumeSpecName: "kube-api-access-k9tmd") pod "bf878bd0-8992-4ff9-be60-83e858b2a690" (UID: "bf878bd0-8992-4ff9-be60-83e858b2a690"). InnerVolumeSpecName "kube-api-access-k9tmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.344396 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf878bd0-8992-4ff9-be60-83e858b2a690-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bf878bd0-8992-4ff9-be60-83e858b2a690" (UID: "bf878bd0-8992-4ff9-be60-83e858b2a690"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.435489 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9tmd\" (UniqueName: \"kubernetes.io/projected/bf878bd0-8992-4ff9-be60-83e858b2a690-kube-api-access-k9tmd\") on node \"crc\" DevicePath \"\"" Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.435847 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bf878bd0-8992-4ff9-be60-83e858b2a690-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.813607 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" event={"ID":"bf878bd0-8992-4ff9-be60-83e858b2a690","Type":"ContainerDied","Data":"f14c470f18367450bc3241a7481d756ae300bd424d295df3bf8df721a2eb5292"} Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.813653 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f14c470f18367450bc3241a7481d756ae300bd424d295df3bf8df721a2eb5292" Oct 06 14:30:03 crc kubenswrapper[4867]: I1006 14:30:03.813660 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329350-dljz5" Oct 06 14:30:04 crc kubenswrapper[4867]: I1006 14:30:04.274362 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj"] Oct 06 14:30:04 crc kubenswrapper[4867]: I1006 14:30:04.285980 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329305-rrfmj"] Oct 06 14:30:05 crc kubenswrapper[4867]: I1006 14:30:05.238205 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d873c69-5229-4308-ac4a-e6e83a067a42" path="/var/lib/kubelet/pods/2d873c69-5229-4308-ac4a-e6e83a067a42/volumes" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.386661 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bpzds"] Oct 06 14:30:16 crc kubenswrapper[4867]: E1006 14:30:16.387832 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf878bd0-8992-4ff9-be60-83e858b2a690" containerName="collect-profiles" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.387850 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf878bd0-8992-4ff9-be60-83e858b2a690" containerName="collect-profiles" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.388077 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf878bd0-8992-4ff9-be60-83e858b2a690" containerName="collect-profiles" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.390165 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.401461 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bpzds"] Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.557594 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56npb\" (UniqueName: \"kubernetes.io/projected/1afa423a-dc5a-4b79-b4ea-5868f9ea04b3-kube-api-access-56npb\") pod \"redhat-operators-bpzds\" (UID: \"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3\") " pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.557722 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1afa423a-dc5a-4b79-b4ea-5868f9ea04b3-catalog-content\") pod \"redhat-operators-bpzds\" (UID: \"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3\") " pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.557751 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1afa423a-dc5a-4b79-b4ea-5868f9ea04b3-utilities\") pod \"redhat-operators-bpzds\" (UID: \"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3\") " pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.659932 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1afa423a-dc5a-4b79-b4ea-5868f9ea04b3-catalog-content\") pod \"redhat-operators-bpzds\" (UID: \"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3\") " pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.659989 4867 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1afa423a-dc5a-4b79-b4ea-5868f9ea04b3-utilities\") pod \"redhat-operators-bpzds\" (UID: \"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3\") " pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.660094 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56npb\" (UniqueName: \"kubernetes.io/projected/1afa423a-dc5a-4b79-b4ea-5868f9ea04b3-kube-api-access-56npb\") pod \"redhat-operators-bpzds\" (UID: \"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3\") " pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.660473 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1afa423a-dc5a-4b79-b4ea-5868f9ea04b3-catalog-content\") pod \"redhat-operators-bpzds\" (UID: \"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3\") " pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.660624 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1afa423a-dc5a-4b79-b4ea-5868f9ea04b3-utilities\") pod \"redhat-operators-bpzds\" (UID: \"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3\") " pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.689702 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56npb\" (UniqueName: \"kubernetes.io/projected/1afa423a-dc5a-4b79-b4ea-5868f9ea04b3-kube-api-access-56npb\") pod \"redhat-operators-bpzds\" (UID: \"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3\") " pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:16 crc kubenswrapper[4867]: I1006 14:30:16.721642 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:17 crc kubenswrapper[4867]: I1006 14:30:17.216472 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bpzds"] Oct 06 14:30:17 crc kubenswrapper[4867]: I1006 14:30:17.959564 4867 generic.go:334] "Generic (PLEG): container finished" podID="1afa423a-dc5a-4b79-b4ea-5868f9ea04b3" containerID="e51fedd3d9ac922825c9016d31e81a5a4acf5a06f12f0f78cc1bb2847fb18d41" exitCode=0 Oct 06 14:30:17 crc kubenswrapper[4867]: I1006 14:30:17.959734 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpzds" event={"ID":"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3","Type":"ContainerDied","Data":"e51fedd3d9ac922825c9016d31e81a5a4acf5a06f12f0f78cc1bb2847fb18d41"} Oct 06 14:30:17 crc kubenswrapper[4867]: I1006 14:30:17.959942 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpzds" event={"ID":"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3","Type":"ContainerStarted","Data":"acfdd99828e3cbebab45fce8ca632eb7ab388b369d267e6ae4992439ef339953"} Oct 06 14:30:17 crc kubenswrapper[4867]: I1006 14:30:17.962922 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 14:30:29 crc kubenswrapper[4867]: I1006 14:30:29.077479 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpzds" event={"ID":"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3","Type":"ContainerStarted","Data":"fadfc3677c63e54ea3a2066b2b48737e478cdc31cb0a2276c591744c8c650299"} Oct 06 14:30:30 crc kubenswrapper[4867]: I1006 14:30:30.091453 4867 generic.go:334] "Generic (PLEG): container finished" podID="1afa423a-dc5a-4b79-b4ea-5868f9ea04b3" containerID="fadfc3677c63e54ea3a2066b2b48737e478cdc31cb0a2276c591744c8c650299" exitCode=0 Oct 06 14:30:30 crc kubenswrapper[4867]: I1006 14:30:30.091528 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-bpzds" event={"ID":"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3","Type":"ContainerDied","Data":"fadfc3677c63e54ea3a2066b2b48737e478cdc31cb0a2276c591744c8c650299"} Oct 06 14:30:33 crc kubenswrapper[4867]: I1006 14:30:33.126396 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpzds" event={"ID":"1afa423a-dc5a-4b79-b4ea-5868f9ea04b3","Type":"ContainerStarted","Data":"c8fb1555324b685a37272aeb676788e503c9e8a05e7a07e2a5dc9491eff3a634"} Oct 06 14:30:33 crc kubenswrapper[4867]: I1006 14:30:33.152835 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bpzds" podStartSLOduration=2.96394112 podStartE2EDuration="17.152814043s" podCreationTimestamp="2025-10-06 14:30:16 +0000 UTC" firstStartedPulling="2025-10-06 14:30:17.962674847 +0000 UTC m=+5197.420622991" lastFinishedPulling="2025-10-06 14:30:32.15154777 +0000 UTC m=+5211.609495914" observedRunningTime="2025-10-06 14:30:33.146617636 +0000 UTC m=+5212.604565830" watchObservedRunningTime="2025-10-06 14:30:33.152814043 +0000 UTC m=+5212.610762187" Oct 06 14:30:36 crc kubenswrapper[4867]: I1006 14:30:36.722381 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:36 crc kubenswrapper[4867]: I1006 14:30:36.722738 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:37 crc kubenswrapper[4867]: I1006 14:30:37.769784 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bpzds" podUID="1afa423a-dc5a-4b79-b4ea-5868f9ea04b3" containerName="registry-server" probeResult="failure" output=< Oct 06 14:30:37 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Oct 06 14:30:37 crc kubenswrapper[4867]: > Oct 06 14:30:42 crc kubenswrapper[4867]: I1006 
14:30:42.874028 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:30:42 crc kubenswrapper[4867]: I1006 14:30:42.875011 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:30:46 crc kubenswrapper[4867]: I1006 14:30:46.783375 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:46 crc kubenswrapper[4867]: I1006 14:30:46.845950 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bpzds" Oct 06 14:30:47 crc kubenswrapper[4867]: I1006 14:30:47.413917 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bpzds"] Oct 06 14:30:47 crc kubenswrapper[4867]: I1006 14:30:47.596121 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9mxp"] Oct 06 14:30:47 crc kubenswrapper[4867]: I1006 14:30:47.596475 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g9mxp" podUID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" containerName="registry-server" containerID="cri-o://c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454" gracePeriod=2 Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.132749 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.225802 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krr5p\" (UniqueName: \"kubernetes.io/projected/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-kube-api-access-krr5p\") pod \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.226193 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-utilities\") pod \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.226241 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-catalog-content\") pod \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\" (UID: \"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e\") " Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.227904 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-utilities" (OuterVolumeSpecName: "utilities") pod "bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" (UID: "bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.238617 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-kube-api-access-krr5p" (OuterVolumeSpecName: "kube-api-access-krr5p") pod "bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" (UID: "bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e"). InnerVolumeSpecName "kube-api-access-krr5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.310940 4867 generic.go:334] "Generic (PLEG): container finished" podID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" containerID="c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454" exitCode=0 Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.312460 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9mxp" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.313122 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mxp" event={"ID":"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e","Type":"ContainerDied","Data":"c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454"} Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.313178 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9mxp" event={"ID":"bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e","Type":"ContainerDied","Data":"ef57b0b9fd44a576ed2997bb1bfeb341e0726f95440075cd1b435099ab58bb7c"} Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.313208 4867 scope.go:117] "RemoveContainer" containerID="c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.330303 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krr5p\" (UniqueName: \"kubernetes.io/projected/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-kube-api-access-krr5p\") on node \"crc\" DevicePath \"\"" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.330351 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.334577 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" (UID: "bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.340904 4867 scope.go:117] "RemoveContainer" containerID="8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.376757 4867 scope.go:117] "RemoveContainer" containerID="0a5122eb65aab2f0bfb93b25f7d1309861c0c487122c29ee4d246f421a91ae49" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.425582 4867 scope.go:117] "RemoveContainer" containerID="c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454" Oct 06 14:30:48 crc kubenswrapper[4867]: E1006 14:30:48.428797 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454\": container with ID starting with c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454 not found: ID does not exist" containerID="c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.428846 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454"} err="failed to get container status \"c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454\": rpc error: code = NotFound desc = could not find container \"c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454\": container with ID starting with c7ad7adc8925c97c33d433373e6d75ae9e63755eb956f4588a1c8154249f7454 not found: ID does not exist" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.428877 4867 scope.go:117] 
"RemoveContainer" containerID="8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a" Oct 06 14:30:48 crc kubenswrapper[4867]: E1006 14:30:48.429531 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a\": container with ID starting with 8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a not found: ID does not exist" containerID="8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.429611 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a"} err="failed to get container status \"8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a\": rpc error: code = NotFound desc = could not find container \"8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a\": container with ID starting with 8118c5e349ffccd102f892525fe3651ad25d2a3aede17dd886ab9f0563f92b2a not found: ID does not exist" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.429659 4867 scope.go:117] "RemoveContainer" containerID="0a5122eb65aab2f0bfb93b25f7d1309861c0c487122c29ee4d246f421a91ae49" Oct 06 14:30:48 crc kubenswrapper[4867]: E1006 14:30:48.430058 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5122eb65aab2f0bfb93b25f7d1309861c0c487122c29ee4d246f421a91ae49\": container with ID starting with 0a5122eb65aab2f0bfb93b25f7d1309861c0c487122c29ee4d246f421a91ae49 not found: ID does not exist" containerID="0a5122eb65aab2f0bfb93b25f7d1309861c0c487122c29ee4d246f421a91ae49" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.430089 4867 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0a5122eb65aab2f0bfb93b25f7d1309861c0c487122c29ee4d246f421a91ae49"} err="failed to get container status \"0a5122eb65aab2f0bfb93b25f7d1309861c0c487122c29ee4d246f421a91ae49\": rpc error: code = NotFound desc = could not find container \"0a5122eb65aab2f0bfb93b25f7d1309861c0c487122c29ee4d246f421a91ae49\": container with ID starting with 0a5122eb65aab2f0bfb93b25f7d1309861c0c487122c29ee4d246f421a91ae49 not found: ID does not exist" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.433618 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.663415 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9mxp"] Oct 06 14:30:48 crc kubenswrapper[4867]: I1006 14:30:48.672188 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g9mxp"] Oct 06 14:30:49 crc kubenswrapper[4867]: I1006 14:30:49.234119 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" path="/var/lib/kubelet/pods/bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e/volumes" Oct 06 14:31:04 crc kubenswrapper[4867]: I1006 14:31:04.221345 4867 scope.go:117] "RemoveContainer" containerID="1e435bd000c504f249c257cde5c809373df810cbc73b285f2c70475395899636" Oct 06 14:31:12 crc kubenswrapper[4867]: I1006 14:31:12.873873 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:31:12 crc kubenswrapper[4867]: I1006 14:31:12.874752 4867 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:31:42 crc kubenswrapper[4867]: I1006 14:31:42.873978 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:31:42 crc kubenswrapper[4867]: I1006 14:31:42.874858 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:31:42 crc kubenswrapper[4867]: I1006 14:31:42.874920 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 14:31:42 crc kubenswrapper[4867]: I1006 14:31:42.875828 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 14:31:42 crc kubenswrapper[4867]: I1006 14:31:42.875898 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" 
containerID="cri-o://8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" gracePeriod=600 Oct 06 14:31:43 crc kubenswrapper[4867]: E1006 14:31:43.001100 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:31:43 crc kubenswrapper[4867]: I1006 14:31:43.922356 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" exitCode=0 Oct 06 14:31:43 crc kubenswrapper[4867]: I1006 14:31:43.922395 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068"} Oct 06 14:31:43 crc kubenswrapper[4867]: I1006 14:31:43.922961 4867 scope.go:117] "RemoveContainer" containerID="5962e81fccdfe31ac3ecff5cdae36fe695f010cdbe5374fcd4f1b6775b5533a1" Oct 06 14:31:43 crc kubenswrapper[4867]: I1006 14:31:43.928977 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:31:43 crc kubenswrapper[4867]: E1006 14:31:43.929768 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" 
podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:31:55 crc kubenswrapper[4867]: I1006 14:31:55.222323 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:31:55 crc kubenswrapper[4867]: E1006 14:31:55.223292 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:32:07 crc kubenswrapper[4867]: I1006 14:32:07.223639 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:32:07 crc kubenswrapper[4867]: E1006 14:32:07.225478 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:32:21 crc kubenswrapper[4867]: I1006 14:32:21.230676 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:32:21 crc kubenswrapper[4867]: E1006 14:32:21.232544 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:32:32 crc kubenswrapper[4867]: I1006 14:32:32.222272 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:32:32 crc kubenswrapper[4867]: E1006 14:32:32.223620 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:32:46 crc kubenswrapper[4867]: I1006 14:32:46.220934 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:32:46 crc kubenswrapper[4867]: E1006 14:32:46.222005 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:32:57 crc kubenswrapper[4867]: I1006 14:32:57.222440 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:32:57 crc kubenswrapper[4867]: E1006 14:32:57.223272 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:33:11 crc kubenswrapper[4867]: I1006 14:33:11.227723 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:33:11 crc kubenswrapper[4867]: E1006 14:33:11.228604 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:33:24 crc kubenswrapper[4867]: I1006 14:33:24.222877 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:33:24 crc kubenswrapper[4867]: E1006 14:33:24.223902 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:33:38 crc kubenswrapper[4867]: I1006 14:33:38.221724 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:33:38 crc kubenswrapper[4867]: E1006 14:33:38.222554 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:33:51 crc kubenswrapper[4867]: I1006 14:33:51.229325 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:33:51 crc kubenswrapper[4867]: E1006 14:33:51.231774 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:34:06 crc kubenswrapper[4867]: I1006 14:34:06.221808 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:34:06 crc kubenswrapper[4867]: E1006 14:34:06.222655 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:34:19 crc kubenswrapper[4867]: I1006 14:34:19.221680 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:34:19 crc kubenswrapper[4867]: E1006 14:34:19.222521 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:34:34 crc kubenswrapper[4867]: I1006 14:34:34.221364 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:34:34 crc kubenswrapper[4867]: E1006 14:34:34.222336 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:34:49 crc kubenswrapper[4867]: I1006 14:34:49.221821 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:34:49 crc kubenswrapper[4867]: E1006 14:34:49.222875 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.366591 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qjzvj"] Oct 06 14:34:51 crc kubenswrapper[4867]: E1006 14:34:51.367843 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" containerName="registry-server" Oct 06 
14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.367860 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" containerName="registry-server" Oct 06 14:34:51 crc kubenswrapper[4867]: E1006 14:34:51.367876 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" containerName="extract-utilities" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.367884 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" containerName="extract-utilities" Oct 06 14:34:51 crc kubenswrapper[4867]: E1006 14:34:51.367897 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" containerName="extract-content" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.367914 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" containerName="extract-content" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.368154 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafad5d0-f8dc-4baa-a2ef-c4456e8b1c5e" containerName="registry-server" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.369952 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.385115 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjzvj"] Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.490581 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-catalog-content\") pod \"redhat-marketplace-qjzvj\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.490638 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zwgr\" (UniqueName: \"kubernetes.io/projected/2266e41f-180f-4e83-bde7-2afbd37e9a0a-kube-api-access-4zwgr\") pod \"redhat-marketplace-qjzvj\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.490704 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-utilities\") pod \"redhat-marketplace-qjzvj\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.592928 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-utilities\") pod \"redhat-marketplace-qjzvj\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.593102 4867 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-catalog-content\") pod \"redhat-marketplace-qjzvj\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.593149 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zwgr\" (UniqueName: \"kubernetes.io/projected/2266e41f-180f-4e83-bde7-2afbd37e9a0a-kube-api-access-4zwgr\") pod \"redhat-marketplace-qjzvj\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.593501 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-utilities\") pod \"redhat-marketplace-qjzvj\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.593536 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-catalog-content\") pod \"redhat-marketplace-qjzvj\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.629132 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zwgr\" (UniqueName: \"kubernetes.io/projected/2266e41f-180f-4e83-bde7-2afbd37e9a0a-kube-api-access-4zwgr\") pod \"redhat-marketplace-qjzvj\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:34:51 crc kubenswrapper[4867]: I1006 14:34:51.699618 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:34:52 crc kubenswrapper[4867]: I1006 14:34:52.150054 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjzvj"] Oct 06 14:34:52 crc kubenswrapper[4867]: I1006 14:34:52.981893 4867 generic.go:334] "Generic (PLEG): container finished" podID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" containerID="e0fd453b7d423a4350ef8b40021041812b9830dea84943df3e73098aac49adb5" exitCode=0 Oct 06 14:34:52 crc kubenswrapper[4867]: I1006 14:34:52.981952 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjzvj" event={"ID":"2266e41f-180f-4e83-bde7-2afbd37e9a0a","Type":"ContainerDied","Data":"e0fd453b7d423a4350ef8b40021041812b9830dea84943df3e73098aac49adb5"} Oct 06 14:34:52 crc kubenswrapper[4867]: I1006 14:34:52.982227 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjzvj" event={"ID":"2266e41f-180f-4e83-bde7-2afbd37e9a0a","Type":"ContainerStarted","Data":"17807579fe854e05504fa595d7a5b3d05cf5309c9f41a54b23441247f8675cd6"} Oct 06 14:34:53 crc kubenswrapper[4867]: I1006 14:34:53.993784 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjzvj" event={"ID":"2266e41f-180f-4e83-bde7-2afbd37e9a0a","Type":"ContainerStarted","Data":"d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58"} Oct 06 14:34:55 crc kubenswrapper[4867]: I1006 14:34:55.005285 4867 generic.go:334] "Generic (PLEG): container finished" podID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" containerID="d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58" exitCode=0 Oct 06 14:34:55 crc kubenswrapper[4867]: I1006 14:34:55.005343 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjzvj" 
event={"ID":"2266e41f-180f-4e83-bde7-2afbd37e9a0a","Type":"ContainerDied","Data":"d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58"} Oct 06 14:34:56 crc kubenswrapper[4867]: I1006 14:34:56.018679 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjzvj" event={"ID":"2266e41f-180f-4e83-bde7-2afbd37e9a0a","Type":"ContainerStarted","Data":"dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3"} Oct 06 14:34:56 crc kubenswrapper[4867]: I1006 14:34:56.045567 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qjzvj" podStartSLOduration=2.606931623 podStartE2EDuration="5.045550489s" podCreationTimestamp="2025-10-06 14:34:51 +0000 UTC" firstStartedPulling="2025-10-06 14:34:52.985317279 +0000 UTC m=+5472.443265423" lastFinishedPulling="2025-10-06 14:34:55.423936135 +0000 UTC m=+5474.881884289" observedRunningTime="2025-10-06 14:34:56.037532805 +0000 UTC m=+5475.495480949" watchObservedRunningTime="2025-10-06 14:34:56.045550489 +0000 UTC m=+5475.503498633" Oct 06 14:35:01 crc kubenswrapper[4867]: I1006 14:35:01.229460 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:35:01 crc kubenswrapper[4867]: E1006 14:35:01.230239 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:35:01 crc kubenswrapper[4867]: I1006 14:35:01.701028 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:35:01 crc 
kubenswrapper[4867]: I1006 14:35:01.701390 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:35:01 crc kubenswrapper[4867]: I1006 14:35:01.773111 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:35:02 crc kubenswrapper[4867]: I1006 14:35:02.158788 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:35:02 crc kubenswrapper[4867]: I1006 14:35:02.226896 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjzvj"] Oct 06 14:35:04 crc kubenswrapper[4867]: I1006 14:35:04.101961 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qjzvj" podUID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" containerName="registry-server" containerID="cri-o://dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3" gracePeriod=2 Oct 06 14:35:04 crc kubenswrapper[4867]: I1006 14:35:04.590036 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:35:04 crc kubenswrapper[4867]: I1006 14:35:04.789274 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-catalog-content\") pod \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " Oct 06 14:35:04 crc kubenswrapper[4867]: I1006 14:35:04.789783 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zwgr\" (UniqueName: \"kubernetes.io/projected/2266e41f-180f-4e83-bde7-2afbd37e9a0a-kube-api-access-4zwgr\") pod \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " Oct 06 14:35:04 crc kubenswrapper[4867]: I1006 14:35:04.789841 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-utilities\") pod \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\" (UID: \"2266e41f-180f-4e83-bde7-2afbd37e9a0a\") " Oct 06 14:35:04 crc kubenswrapper[4867]: I1006 14:35:04.790600 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-utilities" (OuterVolumeSpecName: "utilities") pod "2266e41f-180f-4e83-bde7-2afbd37e9a0a" (UID: "2266e41f-180f-4e83-bde7-2afbd37e9a0a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:35:04 crc kubenswrapper[4867]: I1006 14:35:04.802448 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2266e41f-180f-4e83-bde7-2afbd37e9a0a" (UID: "2266e41f-180f-4e83-bde7-2afbd37e9a0a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:35:04 crc kubenswrapper[4867]: I1006 14:35:04.804282 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2266e41f-180f-4e83-bde7-2afbd37e9a0a-kube-api-access-4zwgr" (OuterVolumeSpecName: "kube-api-access-4zwgr") pod "2266e41f-180f-4e83-bde7-2afbd37e9a0a" (UID: "2266e41f-180f-4e83-bde7-2afbd37e9a0a"). InnerVolumeSpecName "kube-api-access-4zwgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:35:04 crc kubenswrapper[4867]: I1006 14:35:04.891763 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zwgr\" (UniqueName: \"kubernetes.io/projected/2266e41f-180f-4e83-bde7-2afbd37e9a0a-kube-api-access-4zwgr\") on node \"crc\" DevicePath \"\"" Oct 06 14:35:04 crc kubenswrapper[4867]: I1006 14:35:04.891813 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:35:04 crc kubenswrapper[4867]: I1006 14:35:04.891823 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2266e41f-180f-4e83-bde7-2afbd37e9a0a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.114118 4867 generic.go:334] "Generic (PLEG): container finished" podID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" containerID="dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3" exitCode=0 Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.114171 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjzvj" event={"ID":"2266e41f-180f-4e83-bde7-2afbd37e9a0a","Type":"ContainerDied","Data":"dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3"} Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.114219 4867 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjzvj" event={"ID":"2266e41f-180f-4e83-bde7-2afbd37e9a0a","Type":"ContainerDied","Data":"17807579fe854e05504fa595d7a5b3d05cf5309c9f41a54b23441247f8675cd6"} Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.114235 4867 scope.go:117] "RemoveContainer" containerID="dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3" Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.114178 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjzvj" Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.144301 4867 scope.go:117] "RemoveContainer" containerID="d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58" Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.156666 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjzvj"] Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.171994 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjzvj"] Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.185042 4867 scope.go:117] "RemoveContainer" containerID="e0fd453b7d423a4350ef8b40021041812b9830dea84943df3e73098aac49adb5" Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.218218 4867 scope.go:117] "RemoveContainer" containerID="dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3" Oct 06 14:35:05 crc kubenswrapper[4867]: E1006 14:35:05.218654 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3\": container with ID starting with dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3 not found: ID does not exist" containerID="dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3" Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 
14:35:05.218681 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3"} err="failed to get container status \"dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3\": rpc error: code = NotFound desc = could not find container \"dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3\": container with ID starting with dfd1071cc1fa2f83102fa78af02300bda8b84b9bcd89916417bbfc37301979c3 not found: ID does not exist" Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.218702 4867 scope.go:117] "RemoveContainer" containerID="d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58" Oct 06 14:35:05 crc kubenswrapper[4867]: E1006 14:35:05.219092 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58\": container with ID starting with d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58 not found: ID does not exist" containerID="d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58" Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.219114 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58"} err="failed to get container status \"d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58\": rpc error: code = NotFound desc = could not find container \"d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58\": container with ID starting with d5fe7bd64c58b710d2717a15fc3c370518963047b4fa8d052744cb2620299b58 not found: ID does not exist" Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.219126 4867 scope.go:117] "RemoveContainer" containerID="e0fd453b7d423a4350ef8b40021041812b9830dea84943df3e73098aac49adb5" Oct 06 14:35:05 crc 
kubenswrapper[4867]: E1006 14:35:05.219612 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0fd453b7d423a4350ef8b40021041812b9830dea84943df3e73098aac49adb5\": container with ID starting with e0fd453b7d423a4350ef8b40021041812b9830dea84943df3e73098aac49adb5 not found: ID does not exist" containerID="e0fd453b7d423a4350ef8b40021041812b9830dea84943df3e73098aac49adb5" Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.219631 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0fd453b7d423a4350ef8b40021041812b9830dea84943df3e73098aac49adb5"} err="failed to get container status \"e0fd453b7d423a4350ef8b40021041812b9830dea84943df3e73098aac49adb5\": rpc error: code = NotFound desc = could not find container \"e0fd453b7d423a4350ef8b40021041812b9830dea84943df3e73098aac49adb5\": container with ID starting with e0fd453b7d423a4350ef8b40021041812b9830dea84943df3e73098aac49adb5 not found: ID does not exist" Oct 06 14:35:05 crc kubenswrapper[4867]: I1006 14:35:05.235380 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" path="/var/lib/kubelet/pods/2266e41f-180f-4e83-bde7-2afbd37e9a0a/volumes" Oct 06 14:35:13 crc kubenswrapper[4867]: I1006 14:35:13.222331 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:35:13 crc kubenswrapper[4867]: E1006 14:35:13.223129 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:35:25 crc 
kubenswrapper[4867]: I1006 14:35:25.222713 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:35:25 crc kubenswrapper[4867]: E1006 14:35:25.223936 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:35:39 crc kubenswrapper[4867]: I1006 14:35:39.221561 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:35:39 crc kubenswrapper[4867]: E1006 14:35:39.222378 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:35:53 crc kubenswrapper[4867]: I1006 14:35:53.221221 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:35:53 crc kubenswrapper[4867]: E1006 14:35:53.222007 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 
06 14:36:05 crc kubenswrapper[4867]: I1006 14:36:05.221688 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:36:05 crc kubenswrapper[4867]: E1006 14:36:05.222667 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:36:20 crc kubenswrapper[4867]: I1006 14:36:20.220914 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:36:20 crc kubenswrapper[4867]: E1006 14:36:20.222850 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:36:33 crc kubenswrapper[4867]: I1006 14:36:33.221981 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:36:33 crc kubenswrapper[4867]: E1006 14:36:33.222813 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" 
podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:36:46 crc kubenswrapper[4867]: I1006 14:36:46.220998 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:36:47 crc kubenswrapper[4867]: I1006 14:36:47.130541 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"f9da5b3ec3f12078011440e5be465b37e5aa4d49c61b498acf56e70c5ba427e7"} Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.499294 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t77p5"] Oct 06 14:38:10 crc kubenswrapper[4867]: E1006 14:38:10.500612 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" containerName="registry-server" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.500635 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" containerName="registry-server" Oct 06 14:38:10 crc kubenswrapper[4867]: E1006 14:38:10.500653 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" containerName="extract-content" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.500661 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" containerName="extract-content" Oct 06 14:38:10 crc kubenswrapper[4867]: E1006 14:38:10.500700 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" containerName="extract-utilities" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.500710 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" containerName="extract-utilities" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.501003 4867 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2266e41f-180f-4e83-bde7-2afbd37e9a0a" containerName="registry-server" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.503068 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.526938 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t77p5"] Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.665392 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-utilities\") pod \"certified-operators-t77p5\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.665513 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-catalog-content\") pod \"certified-operators-t77p5\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.665893 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg6h4\" (UniqueName: \"kubernetes.io/projected/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-kube-api-access-sg6h4\") pod \"certified-operators-t77p5\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.769027 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg6h4\" (UniqueName: 
\"kubernetes.io/projected/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-kube-api-access-sg6h4\") pod \"certified-operators-t77p5\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.769179 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-utilities\") pod \"certified-operators-t77p5\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.769233 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-catalog-content\") pod \"certified-operators-t77p5\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.769965 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-catalog-content\") pod \"certified-operators-t77p5\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.770065 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-utilities\") pod \"certified-operators-t77p5\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.799281 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg6h4\" (UniqueName: 
\"kubernetes.io/projected/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-kube-api-access-sg6h4\") pod \"certified-operators-t77p5\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:10 crc kubenswrapper[4867]: I1006 14:38:10.827205 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:11 crc kubenswrapper[4867]: I1006 14:38:11.456909 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t77p5"] Oct 06 14:38:12 crc kubenswrapper[4867]: I1006 14:38:12.025509 4867 generic.go:334] "Generic (PLEG): container finished" podID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" containerID="0ec8bda9ba28468b8f2e8f6ceb0795f3f0dbc3397ea50056140b1308690af88f" exitCode=0 Oct 06 14:38:12 crc kubenswrapper[4867]: I1006 14:38:12.025588 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t77p5" event={"ID":"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c","Type":"ContainerDied","Data":"0ec8bda9ba28468b8f2e8f6ceb0795f3f0dbc3397ea50056140b1308690af88f"} Oct 06 14:38:12 crc kubenswrapper[4867]: I1006 14:38:12.025987 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t77p5" event={"ID":"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c","Type":"ContainerStarted","Data":"fc681790ad716d3af667c7ab272cda36395d035adfdcdd187dc07bf3bedea6ba"} Oct 06 14:38:12 crc kubenswrapper[4867]: I1006 14:38:12.029729 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 14:38:13 crc kubenswrapper[4867]: I1006 14:38:13.043818 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t77p5" event={"ID":"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c","Type":"ContainerStarted","Data":"85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124"} Oct 06 14:38:13 
crc kubenswrapper[4867]: I1006 14:38:13.913316 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m87hx"] Oct 06 14:38:13 crc kubenswrapper[4867]: I1006 14:38:13.916696 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:13 crc kubenswrapper[4867]: I1006 14:38:13.930488 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m87hx"] Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.055654 4867 generic.go:334] "Generic (PLEG): container finished" podID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" containerID="85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124" exitCode=0 Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.055722 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t77p5" event={"ID":"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c","Type":"ContainerDied","Data":"85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124"} Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.077289 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmbgm\" (UniqueName: \"kubernetes.io/projected/833b8594-3a46-414b-9496-2f02971a2d0b-kube-api-access-jmbgm\") pod \"community-operators-m87hx\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.077548 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-utilities\") pod \"community-operators-m87hx\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.077615 4867 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-catalog-content\") pod \"community-operators-m87hx\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.179199 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-utilities\") pod \"community-operators-m87hx\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.179316 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-catalog-content\") pod \"community-operators-m87hx\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.179518 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmbgm\" (UniqueName: \"kubernetes.io/projected/833b8594-3a46-414b-9496-2f02971a2d0b-kube-api-access-jmbgm\") pod \"community-operators-m87hx\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.180152 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-catalog-content\") pod \"community-operators-m87hx\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.180175 4867 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-utilities\") pod \"community-operators-m87hx\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.213666 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmbgm\" (UniqueName: \"kubernetes.io/projected/833b8594-3a46-414b-9496-2f02971a2d0b-kube-api-access-jmbgm\") pod \"community-operators-m87hx\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.266812 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:14 crc kubenswrapper[4867]: I1006 14:38:14.952146 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m87hx"] Oct 06 14:38:15 crc kubenswrapper[4867]: I1006 14:38:15.070362 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t77p5" event={"ID":"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c","Type":"ContainerStarted","Data":"2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc"} Oct 06 14:38:15 crc kubenswrapper[4867]: I1006 14:38:15.076055 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m87hx" event={"ID":"833b8594-3a46-414b-9496-2f02971a2d0b","Type":"ContainerStarted","Data":"5f6d376fd1a02bd5f86a8b0d7ac3dad5286c2504173727a63fc9f58ccbefec10"} Oct 06 14:38:15 crc kubenswrapper[4867]: I1006 14:38:15.104886 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t77p5" podStartSLOduration=2.592858822 podStartE2EDuration="5.104860393s" 
podCreationTimestamp="2025-10-06 14:38:10 +0000 UTC" firstStartedPulling="2025-10-06 14:38:12.029474267 +0000 UTC m=+5671.487422401" lastFinishedPulling="2025-10-06 14:38:14.541475828 +0000 UTC m=+5673.999423972" observedRunningTime="2025-10-06 14:38:15.08983636 +0000 UTC m=+5674.547784524" watchObservedRunningTime="2025-10-06 14:38:15.104860393 +0000 UTC m=+5674.562808537" Oct 06 14:38:16 crc kubenswrapper[4867]: I1006 14:38:16.092209 4867 generic.go:334] "Generic (PLEG): container finished" podID="833b8594-3a46-414b-9496-2f02971a2d0b" containerID="43ebb4d7e1c7b425c96a31f74a749679ae7b11ef730f402e48dcabe0a166fcfd" exitCode=0 Oct 06 14:38:16 crc kubenswrapper[4867]: I1006 14:38:16.092402 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m87hx" event={"ID":"833b8594-3a46-414b-9496-2f02971a2d0b","Type":"ContainerDied","Data":"43ebb4d7e1c7b425c96a31f74a749679ae7b11ef730f402e48dcabe0a166fcfd"} Oct 06 14:38:18 crc kubenswrapper[4867]: I1006 14:38:18.116502 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m87hx" event={"ID":"833b8594-3a46-414b-9496-2f02971a2d0b","Type":"ContainerStarted","Data":"104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401"} Oct 06 14:38:19 crc kubenswrapper[4867]: I1006 14:38:19.128590 4867 generic.go:334] "Generic (PLEG): container finished" podID="833b8594-3a46-414b-9496-2f02971a2d0b" containerID="104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401" exitCode=0 Oct 06 14:38:19 crc kubenswrapper[4867]: I1006 14:38:19.128649 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m87hx" event={"ID":"833b8594-3a46-414b-9496-2f02971a2d0b","Type":"ContainerDied","Data":"104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401"} Oct 06 14:38:20 crc kubenswrapper[4867]: I1006 14:38:20.144031 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-m87hx" event={"ID":"833b8594-3a46-414b-9496-2f02971a2d0b","Type":"ContainerStarted","Data":"94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727"} Oct 06 14:38:20 crc kubenswrapper[4867]: I1006 14:38:20.173468 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m87hx" podStartSLOduration=3.641843432 podStartE2EDuration="7.17343519s" podCreationTimestamp="2025-10-06 14:38:13 +0000 UTC" firstStartedPulling="2025-10-06 14:38:16.095398051 +0000 UTC m=+5675.553346215" lastFinishedPulling="2025-10-06 14:38:19.626989829 +0000 UTC m=+5679.084937973" observedRunningTime="2025-10-06 14:38:20.163558345 +0000 UTC m=+5679.621506509" watchObservedRunningTime="2025-10-06 14:38:20.17343519 +0000 UTC m=+5679.631383354" Oct 06 14:38:20 crc kubenswrapper[4867]: I1006 14:38:20.828412 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:20 crc kubenswrapper[4867]: I1006 14:38:20.828494 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:20 crc kubenswrapper[4867]: I1006 14:38:20.894114 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:21 crc kubenswrapper[4867]: I1006 14:38:21.219087 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:22 crc kubenswrapper[4867]: I1006 14:38:22.486088 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t77p5"] Oct 06 14:38:23 crc kubenswrapper[4867]: I1006 14:38:23.173192 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t77p5" 
podUID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" containerName="registry-server" containerID="cri-o://2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc" gracePeriod=2 Oct 06 14:38:23 crc kubenswrapper[4867]: I1006 14:38:23.715292 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:23 crc kubenswrapper[4867]: I1006 14:38:23.821805 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-catalog-content\") pod \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " Oct 06 14:38:23 crc kubenswrapper[4867]: I1006 14:38:23.822264 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg6h4\" (UniqueName: \"kubernetes.io/projected/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-kube-api-access-sg6h4\") pod \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " Oct 06 14:38:23 crc kubenswrapper[4867]: I1006 14:38:23.822466 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-utilities\") pod \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\" (UID: \"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c\") " Oct 06 14:38:23 crc kubenswrapper[4867]: I1006 14:38:23.825243 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-utilities" (OuterVolumeSpecName: "utilities") pod "e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" (UID: "e0e7e5af-8775-40a9-a5dd-d178b30c6b8c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:38:23 crc kubenswrapper[4867]: I1006 14:38:23.851110 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-kube-api-access-sg6h4" (OuterVolumeSpecName: "kube-api-access-sg6h4") pod "e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" (UID: "e0e7e5af-8775-40a9-a5dd-d178b30c6b8c"). InnerVolumeSpecName "kube-api-access-sg6h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:38:23 crc kubenswrapper[4867]: I1006 14:38:23.896915 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" (UID: "e0e7e5af-8775-40a9-a5dd-d178b30c6b8c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:38:23 crc kubenswrapper[4867]: I1006 14:38:23.925428 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:23 crc kubenswrapper[4867]: I1006 14:38:23.925471 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:23 crc kubenswrapper[4867]: I1006 14:38:23.925512 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg6h4\" (UniqueName: \"kubernetes.io/projected/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c-kube-api-access-sg6h4\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.187727 4867 generic.go:334] "Generic (PLEG): container finished" podID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" 
containerID="2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc" exitCode=0 Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.187816 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t77p5" event={"ID":"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c","Type":"ContainerDied","Data":"2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc"} Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.187867 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t77p5" event={"ID":"e0e7e5af-8775-40a9-a5dd-d178b30c6b8c","Type":"ContainerDied","Data":"fc681790ad716d3af667c7ab272cda36395d035adfdcdd187dc07bf3bedea6ba"} Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.187890 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t77p5" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.187903 4867 scope.go:117] "RemoveContainer" containerID="2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.218741 4867 scope.go:117] "RemoveContainer" containerID="85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.226329 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t77p5"] Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.236787 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t77p5"] Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.256620 4867 scope.go:117] "RemoveContainer" containerID="0ec8bda9ba28468b8f2e8f6ceb0795f3f0dbc3397ea50056140b1308690af88f" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.267331 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.267394 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.316997 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.351373 4867 scope.go:117] "RemoveContainer" containerID="2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc" Oct 06 14:38:24 crc kubenswrapper[4867]: E1006 14:38:24.352176 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc\": container with ID starting with 2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc not found: ID does not exist" containerID="2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.352271 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc"} err="failed to get container status \"2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc\": rpc error: code = NotFound desc = could not find container \"2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc\": container with ID starting with 2fcd25e9bf63cdaabfa213c19adb1f3cd4a2603f92b20ed67fa0b47102f0d2bc not found: ID does not exist" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.352316 4867 scope.go:117] "RemoveContainer" containerID="85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124" Oct 06 14:38:24 crc kubenswrapper[4867]: E1006 14:38:24.353354 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124\": container with ID starting with 85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124 not found: ID does not exist" containerID="85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.353396 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124"} err="failed to get container status \"85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124\": rpc error: code = NotFound desc = could not find container \"85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124\": container with ID starting with 85ea8a1df670826f6c75544889918704b33a305bba817e44d75490179ca59124 not found: ID does not exist" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.353417 4867 scope.go:117] "RemoveContainer" containerID="0ec8bda9ba28468b8f2e8f6ceb0795f3f0dbc3397ea50056140b1308690af88f" Oct 06 14:38:24 crc kubenswrapper[4867]: E1006 14:38:24.353824 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec8bda9ba28468b8f2e8f6ceb0795f3f0dbc3397ea50056140b1308690af88f\": container with ID starting with 0ec8bda9ba28468b8f2e8f6ceb0795f3f0dbc3397ea50056140b1308690af88f not found: ID does not exist" containerID="0ec8bda9ba28468b8f2e8f6ceb0795f3f0dbc3397ea50056140b1308690af88f" Oct 06 14:38:24 crc kubenswrapper[4867]: I1006 14:38:24.353852 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec8bda9ba28468b8f2e8f6ceb0795f3f0dbc3397ea50056140b1308690af88f"} err="failed to get container status \"0ec8bda9ba28468b8f2e8f6ceb0795f3f0dbc3397ea50056140b1308690af88f\": rpc error: code = NotFound desc = could not find container 
\"0ec8bda9ba28468b8f2e8f6ceb0795f3f0dbc3397ea50056140b1308690af88f\": container with ID starting with 0ec8bda9ba28468b8f2e8f6ceb0795f3f0dbc3397ea50056140b1308690af88f not found: ID does not exist" Oct 06 14:38:25 crc kubenswrapper[4867]: I1006 14:38:25.238890 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" path="/var/lib/kubelet/pods/e0e7e5af-8775-40a9-a5dd-d178b30c6b8c/volumes" Oct 06 14:38:25 crc kubenswrapper[4867]: I1006 14:38:25.267208 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:26 crc kubenswrapper[4867]: I1006 14:38:26.682132 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m87hx"] Oct 06 14:38:27 crc kubenswrapper[4867]: I1006 14:38:27.222023 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m87hx" podUID="833b8594-3a46-414b-9496-2f02971a2d0b" containerName="registry-server" containerID="cri-o://94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727" gracePeriod=2 Oct 06 14:38:27 crc kubenswrapper[4867]: I1006 14:38:27.774620 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:27 crc kubenswrapper[4867]: I1006 14:38:27.917445 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-utilities\") pod \"833b8594-3a46-414b-9496-2f02971a2d0b\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " Oct 06 14:38:27 crc kubenswrapper[4867]: I1006 14:38:27.917918 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-catalog-content\") pod \"833b8594-3a46-414b-9496-2f02971a2d0b\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " Oct 06 14:38:27 crc kubenswrapper[4867]: I1006 14:38:27.918015 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmbgm\" (UniqueName: \"kubernetes.io/projected/833b8594-3a46-414b-9496-2f02971a2d0b-kube-api-access-jmbgm\") pod \"833b8594-3a46-414b-9496-2f02971a2d0b\" (UID: \"833b8594-3a46-414b-9496-2f02971a2d0b\") " Oct 06 14:38:27 crc kubenswrapper[4867]: I1006 14:38:27.919331 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-utilities" (OuterVolumeSpecName: "utilities") pod "833b8594-3a46-414b-9496-2f02971a2d0b" (UID: "833b8594-3a46-414b-9496-2f02971a2d0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:38:27 crc kubenswrapper[4867]: I1006 14:38:27.927744 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833b8594-3a46-414b-9496-2f02971a2d0b-kube-api-access-jmbgm" (OuterVolumeSpecName: "kube-api-access-jmbgm") pod "833b8594-3a46-414b-9496-2f02971a2d0b" (UID: "833b8594-3a46-414b-9496-2f02971a2d0b"). InnerVolumeSpecName "kube-api-access-jmbgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:38:27 crc kubenswrapper[4867]: I1006 14:38:27.977240 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "833b8594-3a46-414b-9496-2f02971a2d0b" (UID: "833b8594-3a46-414b-9496-2f02971a2d0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.020876 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.020912 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/833b8594-3a46-414b-9496-2f02971a2d0b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.020925 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmbgm\" (UniqueName: \"kubernetes.io/projected/833b8594-3a46-414b-9496-2f02971a2d0b-kube-api-access-jmbgm\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.237558 4867 generic.go:334] "Generic (PLEG): container finished" podID="833b8594-3a46-414b-9496-2f02971a2d0b" containerID="94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727" exitCode=0 Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.237642 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m87hx" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.237639 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m87hx" event={"ID":"833b8594-3a46-414b-9496-2f02971a2d0b","Type":"ContainerDied","Data":"94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727"} Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.238602 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m87hx" event={"ID":"833b8594-3a46-414b-9496-2f02971a2d0b","Type":"ContainerDied","Data":"5f6d376fd1a02bd5f86a8b0d7ac3dad5286c2504173727a63fc9f58ccbefec10"} Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.238634 4867 scope.go:117] "RemoveContainer" containerID="94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.274578 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m87hx"] Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.279503 4867 scope.go:117] "RemoveContainer" containerID="104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.284242 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m87hx"] Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.303675 4867 scope.go:117] "RemoveContainer" containerID="43ebb4d7e1c7b425c96a31f74a749679ae7b11ef730f402e48dcabe0a166fcfd" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.359914 4867 scope.go:117] "RemoveContainer" containerID="94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727" Oct 06 14:38:28 crc kubenswrapper[4867]: E1006 14:38:28.360661 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727\": container with ID starting with 94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727 not found: ID does not exist" containerID="94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.360697 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727"} err="failed to get container status \"94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727\": rpc error: code = NotFound desc = could not find container \"94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727\": container with ID starting with 94e3409c3ce1e6c99b027ee5a8fe08375aadaa547f4e22bbe1bc5ff8e7e44727 not found: ID does not exist" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.360726 4867 scope.go:117] "RemoveContainer" containerID="104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401" Oct 06 14:38:28 crc kubenswrapper[4867]: E1006 14:38:28.361425 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401\": container with ID starting with 104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401 not found: ID does not exist" containerID="104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.361461 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401"} err="failed to get container status \"104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401\": rpc error: code = NotFound desc = could not find container \"104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401\": container with ID 
starting with 104f532982fd59a9303b3802db4214ff26211f0e5e6e5f629d5d861882b75401 not found: ID does not exist" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.361476 4867 scope.go:117] "RemoveContainer" containerID="43ebb4d7e1c7b425c96a31f74a749679ae7b11ef730f402e48dcabe0a166fcfd" Oct 06 14:38:28 crc kubenswrapper[4867]: E1006 14:38:28.361739 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ebb4d7e1c7b425c96a31f74a749679ae7b11ef730f402e48dcabe0a166fcfd\": container with ID starting with 43ebb4d7e1c7b425c96a31f74a749679ae7b11ef730f402e48dcabe0a166fcfd not found: ID does not exist" containerID="43ebb4d7e1c7b425c96a31f74a749679ae7b11ef730f402e48dcabe0a166fcfd" Oct 06 14:38:28 crc kubenswrapper[4867]: I1006 14:38:28.361802 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ebb4d7e1c7b425c96a31f74a749679ae7b11ef730f402e48dcabe0a166fcfd"} err="failed to get container status \"43ebb4d7e1c7b425c96a31f74a749679ae7b11ef730f402e48dcabe0a166fcfd\": rpc error: code = NotFound desc = could not find container \"43ebb4d7e1c7b425c96a31f74a749679ae7b11ef730f402e48dcabe0a166fcfd\": container with ID starting with 43ebb4d7e1c7b425c96a31f74a749679ae7b11ef730f402e48dcabe0a166fcfd not found: ID does not exist" Oct 06 14:38:29 crc kubenswrapper[4867]: I1006 14:38:29.231806 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="833b8594-3a46-414b-9496-2f02971a2d0b" path="/var/lib/kubelet/pods/833b8594-3a46-414b-9496-2f02971a2d0b/volumes" Oct 06 14:38:40 crc kubenswrapper[4867]: I1006 14:38:40.354649 4867 generic.go:334] "Generic (PLEG): container finished" podID="43684055-87e6-4568-8a80-8019600aaeef" containerID="80cda36f8946522fe0e1cac3986cc232aee88e99304fb948bc0d9a610cc4f637" exitCode=0 Oct 06 14:38:40 crc kubenswrapper[4867]: I1006 14:38:40.354811 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"43684055-87e6-4568-8a80-8019600aaeef","Type":"ContainerDied","Data":"80cda36f8946522fe0e1cac3986cc232aee88e99304fb948bc0d9a610cc4f637"} Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.711982 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.788246 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-openstack-config\") pod \"43684055-87e6-4568-8a80-8019600aaeef\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.788320 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ca-certs\") pod \"43684055-87e6-4568-8a80-8019600aaeef\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.788363 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-temporary\") pod \"43684055-87e6-4568-8a80-8019600aaeef\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.788445 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g778f\" (UniqueName: \"kubernetes.io/projected/43684055-87e6-4568-8a80-8019600aaeef-kube-api-access-g778f\") pod \"43684055-87e6-4568-8a80-8019600aaeef\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.788466 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ssh-key\") pod \"43684055-87e6-4568-8a80-8019600aaeef\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.788498 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-config-data\") pod \"43684055-87e6-4568-8a80-8019600aaeef\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.788598 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-openstack-config-secret\") pod \"43684055-87e6-4568-8a80-8019600aaeef\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.788685 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"43684055-87e6-4568-8a80-8019600aaeef\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.788717 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-workdir\") pod \"43684055-87e6-4568-8a80-8019600aaeef\" (UID: \"43684055-87e6-4568-8a80-8019600aaeef\") " Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.789885 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "43684055-87e6-4568-8a80-8019600aaeef" (UID: "43684055-87e6-4568-8a80-8019600aaeef"). 
InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.790042 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-config-data" (OuterVolumeSpecName: "config-data") pod "43684055-87e6-4568-8a80-8019600aaeef" (UID: "43684055-87e6-4568-8a80-8019600aaeef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.794175 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "43684055-87e6-4568-8a80-8019600aaeef" (UID: "43684055-87e6-4568-8a80-8019600aaeef"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.794800 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43684055-87e6-4568-8a80-8019600aaeef-kube-api-access-g778f" (OuterVolumeSpecName: "kube-api-access-g778f") pod "43684055-87e6-4568-8a80-8019600aaeef" (UID: "43684055-87e6-4568-8a80-8019600aaeef"). InnerVolumeSpecName "kube-api-access-g778f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.796201 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "43684055-87e6-4568-8a80-8019600aaeef" (UID: "43684055-87e6-4568-8a80-8019600aaeef"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.818457 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "43684055-87e6-4568-8a80-8019600aaeef" (UID: "43684055-87e6-4568-8a80-8019600aaeef"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.819269 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "43684055-87e6-4568-8a80-8019600aaeef" (UID: "43684055-87e6-4568-8a80-8019600aaeef"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.819401 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "43684055-87e6-4568-8a80-8019600aaeef" (UID: "43684055-87e6-4568-8a80-8019600aaeef"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.841209 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "43684055-87e6-4568-8a80-8019600aaeef" (UID: "43684055-87e6-4568-8a80-8019600aaeef"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.890781 4867 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.890850 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g778f\" (UniqueName: \"kubernetes.io/projected/43684055-87e6-4568-8a80-8019600aaeef-kube-api-access-g778f\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.890862 4867 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.890874 4867 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.890885 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.890920 4867 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.890929 4867 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/43684055-87e6-4568-8a80-8019600aaeef-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 
06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.890938 4867 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/43684055-87e6-4568-8a80-8019600aaeef-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.890947 4867 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/43684055-87e6-4568-8a80-8019600aaeef-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.912628 4867 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 06 14:38:41 crc kubenswrapper[4867]: I1006 14:38:41.993009 4867 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 06 14:38:42 crc kubenswrapper[4867]: I1006 14:38:42.374738 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"43684055-87e6-4568-8a80-8019600aaeef","Type":"ContainerDied","Data":"64137c40f37792976ecc2072fe48b50ea71728e06246f4964cb79f261125f077"} Oct 06 14:38:42 crc kubenswrapper[4867]: I1006 14:38:42.374812 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64137c40f37792976ecc2072fe48b50ea71728e06246f4964cb79f261125f077" Oct 06 14:38:42 crc kubenswrapper[4867]: I1006 14:38:42.374819 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.048326 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 14:38:51 crc kubenswrapper[4867]: E1006 14:38:51.049902 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43684055-87e6-4568-8a80-8019600aaeef" containerName="tempest-tests-tempest-tests-runner" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.049923 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="43684055-87e6-4568-8a80-8019600aaeef" containerName="tempest-tests-tempest-tests-runner" Oct 06 14:38:51 crc kubenswrapper[4867]: E1006 14:38:51.049944 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833b8594-3a46-414b-9496-2f02971a2d0b" containerName="registry-server" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.049953 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="833b8594-3a46-414b-9496-2f02971a2d0b" containerName="registry-server" Oct 06 14:38:51 crc kubenswrapper[4867]: E1006 14:38:51.049989 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833b8594-3a46-414b-9496-2f02971a2d0b" containerName="extract-content" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.049999 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="833b8594-3a46-414b-9496-2f02971a2d0b" containerName="extract-content" Oct 06 14:38:51 crc kubenswrapper[4867]: E1006 14:38:51.050031 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" containerName="extract-utilities" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.050041 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" containerName="extract-utilities" Oct 06 14:38:51 crc kubenswrapper[4867]: E1006 14:38:51.050066 4867 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="833b8594-3a46-414b-9496-2f02971a2d0b" containerName="extract-utilities" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.050075 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="833b8594-3a46-414b-9496-2f02971a2d0b" containerName="extract-utilities" Oct 06 14:38:51 crc kubenswrapper[4867]: E1006 14:38:51.050095 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" containerName="extract-content" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.050107 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" containerName="extract-content" Oct 06 14:38:51 crc kubenswrapper[4867]: E1006 14:38:51.050131 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" containerName="registry-server" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.050139 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" containerName="registry-server" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.050465 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="43684055-87e6-4568-8a80-8019600aaeef" containerName="tempest-tests-tempest-tests-runner" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.050504 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e7e5af-8775-40a9-a5dd-d178b30c6b8c" containerName="registry-server" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.050521 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="833b8594-3a46-414b-9496-2f02971a2d0b" containerName="registry-server" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.051669 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.055219 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n7mnk" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.075568 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.077413 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9b73e25d-d1ba-4829-948f-bba412f56404\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.078016 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjkhx\" (UniqueName: \"kubernetes.io/projected/9b73e25d-d1ba-4829-948f-bba412f56404-kube-api-access-vjkhx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9b73e25d-d1ba-4829-948f-bba412f56404\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.180684 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9b73e25d-d1ba-4829-948f-bba412f56404\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.180838 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjkhx\" (UniqueName: 
\"kubernetes.io/projected/9b73e25d-d1ba-4829-948f-bba412f56404-kube-api-access-vjkhx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9b73e25d-d1ba-4829-948f-bba412f56404\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.181814 4867 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9b73e25d-d1ba-4829-948f-bba412f56404\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.205539 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjkhx\" (UniqueName: \"kubernetes.io/projected/9b73e25d-d1ba-4829-948f-bba412f56404-kube-api-access-vjkhx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9b73e25d-d1ba-4829-948f-bba412f56404\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.236157 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9b73e25d-d1ba-4829-948f-bba412f56404\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.386809 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 06 14:38:51 crc kubenswrapper[4867]: I1006 14:38:51.924705 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 06 14:38:52 crc kubenswrapper[4867]: I1006 14:38:52.487528 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9b73e25d-d1ba-4829-948f-bba412f56404","Type":"ContainerStarted","Data":"de6196ea9989490b733c2d45fe7c0ac4458dead14fbcba09fff8d6b9e29f65d1"} Oct 06 14:38:53 crc kubenswrapper[4867]: I1006 14:38:53.501650 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9b73e25d-d1ba-4829-948f-bba412f56404","Type":"ContainerStarted","Data":"ecc825f598a20192bc91dc1344b25735f688b1ea810fd9f91450ffdb07e29f48"} Oct 06 14:38:53 crc kubenswrapper[4867]: I1006 14:38:53.519836 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.5346659470000001 podStartE2EDuration="2.519814821s" podCreationTimestamp="2025-10-06 14:38:51 +0000 UTC" firstStartedPulling="2025-10-06 14:38:51.931209619 +0000 UTC m=+5711.389157773" lastFinishedPulling="2025-10-06 14:38:52.916358503 +0000 UTC m=+5712.374306647" observedRunningTime="2025-10-06 14:38:53.516648986 +0000 UTC m=+5712.974597140" watchObservedRunningTime="2025-10-06 14:38:53.519814821 +0000 UTC m=+5712.977762965" Oct 06 14:39:10 crc kubenswrapper[4867]: I1006 14:39:10.823305 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t7lp9/must-gather-7rw7n"] Oct 06 14:39:10 crc kubenswrapper[4867]: I1006 14:39:10.827275 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t7lp9/must-gather-7rw7n" Oct 06 14:39:10 crc kubenswrapper[4867]: I1006 14:39:10.830852 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t7lp9"/"openshift-service-ca.crt" Oct 06 14:39:10 crc kubenswrapper[4867]: I1006 14:39:10.831709 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t7lp9"/"kube-root-ca.crt" Oct 06 14:39:10 crc kubenswrapper[4867]: I1006 14:39:10.830910 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t7lp9"/"default-dockercfg-btkvr" Oct 06 14:39:10 crc kubenswrapper[4867]: I1006 14:39:10.850208 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t7lp9/must-gather-7rw7n"] Oct 06 14:39:10 crc kubenswrapper[4867]: I1006 14:39:10.883543 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ac97278f-c2d6-4bb4-849d-7e2024d818bb-must-gather-output\") pod \"must-gather-7rw7n\" (UID: \"ac97278f-c2d6-4bb4-849d-7e2024d818bb\") " pod="openshift-must-gather-t7lp9/must-gather-7rw7n" Oct 06 14:39:10 crc kubenswrapper[4867]: I1006 14:39:10.883912 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpx6l\" (UniqueName: \"kubernetes.io/projected/ac97278f-c2d6-4bb4-849d-7e2024d818bb-kube-api-access-fpx6l\") pod \"must-gather-7rw7n\" (UID: \"ac97278f-c2d6-4bb4-849d-7e2024d818bb\") " pod="openshift-must-gather-t7lp9/must-gather-7rw7n" Oct 06 14:39:10 crc kubenswrapper[4867]: I1006 14:39:10.985504 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ac97278f-c2d6-4bb4-849d-7e2024d818bb-must-gather-output\") pod \"must-gather-7rw7n\" (UID: \"ac97278f-c2d6-4bb4-849d-7e2024d818bb\") " 
pod="openshift-must-gather-t7lp9/must-gather-7rw7n" Oct 06 14:39:10 crc kubenswrapper[4867]: I1006 14:39:10.985593 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpx6l\" (UniqueName: \"kubernetes.io/projected/ac97278f-c2d6-4bb4-849d-7e2024d818bb-kube-api-access-fpx6l\") pod \"must-gather-7rw7n\" (UID: \"ac97278f-c2d6-4bb4-849d-7e2024d818bb\") " pod="openshift-must-gather-t7lp9/must-gather-7rw7n" Oct 06 14:39:10 crc kubenswrapper[4867]: I1006 14:39:10.985971 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ac97278f-c2d6-4bb4-849d-7e2024d818bb-must-gather-output\") pod \"must-gather-7rw7n\" (UID: \"ac97278f-c2d6-4bb4-849d-7e2024d818bb\") " pod="openshift-must-gather-t7lp9/must-gather-7rw7n" Oct 06 14:39:11 crc kubenswrapper[4867]: I1006 14:39:11.006160 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpx6l\" (UniqueName: \"kubernetes.io/projected/ac97278f-c2d6-4bb4-849d-7e2024d818bb-kube-api-access-fpx6l\") pod \"must-gather-7rw7n\" (UID: \"ac97278f-c2d6-4bb4-849d-7e2024d818bb\") " pod="openshift-must-gather-t7lp9/must-gather-7rw7n" Oct 06 14:39:11 crc kubenswrapper[4867]: I1006 14:39:11.162295 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t7lp9/must-gather-7rw7n" Oct 06 14:39:11 crc kubenswrapper[4867]: I1006 14:39:11.639770 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t7lp9/must-gather-7rw7n"] Oct 06 14:39:11 crc kubenswrapper[4867]: I1006 14:39:11.707635 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/must-gather-7rw7n" event={"ID":"ac97278f-c2d6-4bb4-849d-7e2024d818bb","Type":"ContainerStarted","Data":"73359bd1d333402e90540db238e9dde73ae838d71b07a6ff2c4fb98ecda32052"} Oct 06 14:39:12 crc kubenswrapper[4867]: I1006 14:39:12.873665 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:39:12 crc kubenswrapper[4867]: I1006 14:39:12.874081 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:39:18 crc kubenswrapper[4867]: I1006 14:39:18.818086 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/must-gather-7rw7n" event={"ID":"ac97278f-c2d6-4bb4-849d-7e2024d818bb","Type":"ContainerStarted","Data":"ee292c5efc01c94ca7c254211d4a9fb23fdf6a0958a5ea61dec194b1d131b213"} Oct 06 14:39:18 crc kubenswrapper[4867]: I1006 14:39:18.818933 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/must-gather-7rw7n" event={"ID":"ac97278f-c2d6-4bb4-849d-7e2024d818bb","Type":"ContainerStarted","Data":"9f77c77d9961795d391b57ee71fdc40d16ffd45467d81836b33591ced95c58df"} Oct 06 14:39:18 crc 
kubenswrapper[4867]: I1006 14:39:18.842910 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t7lp9/must-gather-7rw7n" podStartSLOduration=2.484166591 podStartE2EDuration="8.842888083s" podCreationTimestamp="2025-10-06 14:39:10 +0000 UTC" firstStartedPulling="2025-10-06 14:39:11.640091856 +0000 UTC m=+5731.098040000" lastFinishedPulling="2025-10-06 14:39:17.998813348 +0000 UTC m=+5737.456761492" observedRunningTime="2025-10-06 14:39:18.833923793 +0000 UTC m=+5738.291871937" watchObservedRunningTime="2025-10-06 14:39:18.842888083 +0000 UTC m=+5738.300836227" Oct 06 14:39:22 crc kubenswrapper[4867]: I1006 14:39:22.600157 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t7lp9/crc-debug-4blgh"] Oct 06 14:39:22 crc kubenswrapper[4867]: I1006 14:39:22.604565 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-4blgh" Oct 06 14:39:22 crc kubenswrapper[4867]: I1006 14:39:22.658013 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qgck\" (UniqueName: \"kubernetes.io/projected/b8565725-0d1b-49a0-a469-3b20311e5f7d-kube-api-access-7qgck\") pod \"crc-debug-4blgh\" (UID: \"b8565725-0d1b-49a0-a469-3b20311e5f7d\") " pod="openshift-must-gather-t7lp9/crc-debug-4blgh" Oct 06 14:39:22 crc kubenswrapper[4867]: I1006 14:39:22.658354 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8565725-0d1b-49a0-a469-3b20311e5f7d-host\") pod \"crc-debug-4blgh\" (UID: \"b8565725-0d1b-49a0-a469-3b20311e5f7d\") " pod="openshift-must-gather-t7lp9/crc-debug-4blgh" Oct 06 14:39:22 crc kubenswrapper[4867]: I1006 14:39:22.761349 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8565725-0d1b-49a0-a469-3b20311e5f7d-host\") pod 
\"crc-debug-4blgh\" (UID: \"b8565725-0d1b-49a0-a469-3b20311e5f7d\") " pod="openshift-must-gather-t7lp9/crc-debug-4blgh" Oct 06 14:39:22 crc kubenswrapper[4867]: I1006 14:39:22.761222 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8565725-0d1b-49a0-a469-3b20311e5f7d-host\") pod \"crc-debug-4blgh\" (UID: \"b8565725-0d1b-49a0-a469-3b20311e5f7d\") " pod="openshift-must-gather-t7lp9/crc-debug-4blgh" Oct 06 14:39:22 crc kubenswrapper[4867]: I1006 14:39:22.761608 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qgck\" (UniqueName: \"kubernetes.io/projected/b8565725-0d1b-49a0-a469-3b20311e5f7d-kube-api-access-7qgck\") pod \"crc-debug-4blgh\" (UID: \"b8565725-0d1b-49a0-a469-3b20311e5f7d\") " pod="openshift-must-gather-t7lp9/crc-debug-4blgh" Oct 06 14:39:22 crc kubenswrapper[4867]: I1006 14:39:22.792892 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qgck\" (UniqueName: \"kubernetes.io/projected/b8565725-0d1b-49a0-a469-3b20311e5f7d-kube-api-access-7qgck\") pod \"crc-debug-4blgh\" (UID: \"b8565725-0d1b-49a0-a469-3b20311e5f7d\") " pod="openshift-must-gather-t7lp9/crc-debug-4blgh" Oct 06 14:39:22 crc kubenswrapper[4867]: I1006 14:39:22.926780 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-4blgh" Oct 06 14:39:22 crc kubenswrapper[4867]: W1006 14:39:22.964212 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8565725_0d1b_49a0_a469_3b20311e5f7d.slice/crio-5b83261b5c975d0fdc3487f50960784a9959f2e5e1ea3277f58cea22196c5cc2 WatchSource:0}: Error finding container 5b83261b5c975d0fdc3487f50960784a9959f2e5e1ea3277f58cea22196c5cc2: Status 404 returned error can't find the container with id 5b83261b5c975d0fdc3487f50960784a9959f2e5e1ea3277f58cea22196c5cc2 Oct 06 14:39:23 crc kubenswrapper[4867]: I1006 14:39:23.870068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/crc-debug-4blgh" event={"ID":"b8565725-0d1b-49a0-a469-3b20311e5f7d","Type":"ContainerStarted","Data":"5b83261b5c975d0fdc3487f50960784a9959f2e5e1ea3277f58cea22196c5cc2"} Oct 06 14:39:33 crc kubenswrapper[4867]: I1006 14:39:33.997823 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/crc-debug-4blgh" event={"ID":"b8565725-0d1b-49a0-a469-3b20311e5f7d","Type":"ContainerStarted","Data":"7b394c8e51551c7e87c17d42d0a568a6a6e23b64986c0fa2bef42449bf5e2302"} Oct 06 14:39:34 crc kubenswrapper[4867]: I1006 14:39:34.022678 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t7lp9/crc-debug-4blgh" podStartSLOduration=1.562596644 podStartE2EDuration="12.022648188s" podCreationTimestamp="2025-10-06 14:39:22 +0000 UTC" firstStartedPulling="2025-10-06 14:39:22.967362466 +0000 UTC m=+5742.425310610" lastFinishedPulling="2025-10-06 14:39:33.42741402 +0000 UTC m=+5752.885362154" observedRunningTime="2025-10-06 14:39:34.011239002 +0000 UTC m=+5753.469187136" watchObservedRunningTime="2025-10-06 14:39:34.022648188 +0000 UTC m=+5753.480596332" Oct 06 14:39:42 crc kubenswrapper[4867]: I1006 14:39:42.873420 4867 patch_prober.go:28] interesting 
pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:39:42 crc kubenswrapper[4867]: I1006 14:39:42.874103 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:40:12 crc kubenswrapper[4867]: I1006 14:40:12.873740 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:40:12 crc kubenswrapper[4867]: I1006 14:40:12.874284 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:40:12 crc kubenswrapper[4867]: I1006 14:40:12.874336 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 14:40:12 crc kubenswrapper[4867]: I1006 14:40:12.875433 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9da5b3ec3f12078011440e5be465b37e5aa4d49c61b498acf56e70c5ba427e7"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 06 14:40:12 crc kubenswrapper[4867]: I1006 14:40:12.875508 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://f9da5b3ec3f12078011440e5be465b37e5aa4d49c61b498acf56e70c5ba427e7" gracePeriod=600 Oct 06 14:40:13 crc kubenswrapper[4867]: I1006 14:40:13.457140 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="f9da5b3ec3f12078011440e5be465b37e5aa4d49c61b498acf56e70c5ba427e7" exitCode=0 Oct 06 14:40:13 crc kubenswrapper[4867]: I1006 14:40:13.457304 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"f9da5b3ec3f12078011440e5be465b37e5aa4d49c61b498acf56e70c5ba427e7"} Oct 06 14:40:13 crc kubenswrapper[4867]: I1006 14:40:13.458128 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b"} Oct 06 14:40:13 crc kubenswrapper[4867]: I1006 14:40:13.458164 4867 scope.go:117] "RemoveContainer" containerID="8f50dde742fc0d9a083826be3b50b0140def3dde3aa775c4708b7404a40ea068" Oct 06 14:40:42 crc kubenswrapper[4867]: I1006 14:40:42.531230 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-86d4db6f74-khhjk_a72dc3e7-d107-4153-9ce3-b092369b5d66/barbican-api-log/0.log" Oct 06 14:40:42 crc kubenswrapper[4867]: I1006 14:40:42.557318 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-86d4db6f74-khhjk_a72dc3e7-d107-4153-9ce3-b092369b5d66/barbican-api/0.log" Oct 06 
14:40:42 crc kubenswrapper[4867]: I1006 14:40:42.737089 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f64dc9ddb-5ltwp_cdf0758d-d2e6-4660-8b3b-677c5febec8f/barbican-keystone-listener/0.log" Oct 06 14:40:42 crc kubenswrapper[4867]: I1006 14:40:42.846657 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f64dc9ddb-5ltwp_cdf0758d-d2e6-4660-8b3b-677c5febec8f/barbican-keystone-listener-log/0.log" Oct 06 14:40:43 crc kubenswrapper[4867]: I1006 14:40:43.010892 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5fc4fffc87-p6rts_f6057ffc-7d15-4097-b9d2-677fa9e69920/barbican-worker/0.log" Oct 06 14:40:43 crc kubenswrapper[4867]: I1006 14:40:43.055345 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5fc4fffc87-p6rts_f6057ffc-7d15-4097-b9d2-677fa9e69920/barbican-worker-log/0.log" Oct 06 14:40:43 crc kubenswrapper[4867]: I1006 14:40:43.297445 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9_e7ba5c1b-0dcb-4509-bb81-4bda347944bf/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:43 crc kubenswrapper[4867]: I1006 14:40:43.554189 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5383175c-d1e2-4f75-a9b6-5986e5062009/ceilometer-notification-agent/0.log" Oct 06 14:40:43 crc kubenswrapper[4867]: I1006 14:40:43.589304 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5383175c-d1e2-4f75-a9b6-5986e5062009/proxy-httpd/0.log" Oct 06 14:40:43 crc kubenswrapper[4867]: I1006 14:40:43.613009 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5383175c-d1e2-4f75-a9b6-5986e5062009/ceilometer-central-agent/0.log" Oct 06 14:40:43 crc kubenswrapper[4867]: I1006 14:40:43.739499 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_5383175c-d1e2-4f75-a9b6-5986e5062009/sg-core/0.log" Oct 06 14:40:44 crc kubenswrapper[4867]: I1006 14:40:44.040433 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_678b77f0-1e51-4788-a7e0-4bc2560a9c6a/cinder-api-log/0.log" Oct 06 14:40:44 crc kubenswrapper[4867]: I1006 14:40:44.069980 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_678b77f0-1e51-4788-a7e0-4bc2560a9c6a/cinder-api/0.log" Oct 06 14:40:44 crc kubenswrapper[4867]: I1006 14:40:44.275328 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a3850291-2d24-472c-9ef2-7f2814c4c321/cinder-scheduler/0.log" Oct 06 14:40:44 crc kubenswrapper[4867]: I1006 14:40:44.332830 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a3850291-2d24-472c-9ef2-7f2814c4c321/probe/0.log" Oct 06 14:40:44 crc kubenswrapper[4867]: I1006 14:40:44.493826 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg_68b10f2c-285a-4492-90c0-1a3d83ab46e7/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:44 crc kubenswrapper[4867]: I1006 14:40:44.649840 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7zljl_cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:44 crc kubenswrapper[4867]: I1006 14:40:44.822359 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9_8b54a7c9-430b-4dfc-9ffb-ae3c790372ee/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:45 crc kubenswrapper[4867]: I1006 14:40:45.032991 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6c48bdb645-wtbz6_662008a2-cb52-48d6-bd6e-7e1c9bd511cf/init/0.log" Oct 06 14:40:45 crc kubenswrapper[4867]: I1006 14:40:45.201774 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c48bdb645-wtbz6_662008a2-cb52-48d6-bd6e-7e1c9bd511cf/init/0.log" Oct 06 14:40:45 crc kubenswrapper[4867]: I1006 14:40:45.389301 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c48bdb645-wtbz6_662008a2-cb52-48d6-bd6e-7e1c9bd511cf/dnsmasq-dns/0.log" Oct 06 14:40:45 crc kubenswrapper[4867]: I1006 14:40:45.445665 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5_b214a0d6-e528-435a-9126-04d18492d264/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:45 crc kubenswrapper[4867]: I1006 14:40:45.586348 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fbbe30dd-179b-4e2b-b011-b395c30e32a9/glance-httpd/0.log" Oct 06 14:40:45 crc kubenswrapper[4867]: I1006 14:40:45.628775 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fbbe30dd-179b-4e2b-b011-b395c30e32a9/glance-log/0.log" Oct 06 14:40:45 crc kubenswrapper[4867]: I1006 14:40:45.763755 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fe3366d3-09d4-49fb-a388-3291fe1e65b0/glance-httpd/0.log" Oct 06 14:40:45 crc kubenswrapper[4867]: I1006 14:40:45.851991 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fe3366d3-09d4-49fb-a388-3291fe1e65b0/glance-log/0.log" Oct 06 14:40:45 crc kubenswrapper[4867]: I1006 14:40:45.999398 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-69d5cf7ffb-c2rgt_d7e92d5c-74ed-47bc-995a-d3712014f109/horizon/0.log" Oct 06 14:40:46 crc kubenswrapper[4867]: I1006 
14:40:46.192600 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf_4c656316-c726-4675-9209-cf119811bc63/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:46 crc kubenswrapper[4867]: I1006 14:40:46.401825 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qbxjs_828fbe77-2fb8-4ed5-b64f-733c1dad834d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:46 crc kubenswrapper[4867]: I1006 14:40:46.594974 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29329321-p9bh2_ceb3352a-f644-4721-9e76-8c27cb9e26ac/keystone-cron/0.log" Oct 06 14:40:46 crc kubenswrapper[4867]: I1006 14:40:46.754454 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-69d5cf7ffb-c2rgt_d7e92d5c-74ed-47bc-995a-d3712014f109/horizon-log/0.log" Oct 06 14:40:46 crc kubenswrapper[4867]: I1006 14:40:46.988956 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_feb72ccb-56bd-433d-b82c-6002fed1e09d/kube-state-metrics/0.log" Oct 06 14:40:47 crc kubenswrapper[4867]: I1006 14:40:47.165919 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9_019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:47 crc kubenswrapper[4867]: I1006 14:40:47.230744 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-dcf7c7d6f-dz9mk_dd2a23fb-89c2-4a8c-b670-3f8330f13265/keystone-api/0.log" Oct 06 14:40:47 crc kubenswrapper[4867]: I1006 14:40:47.948210 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c47455745-hd5zg_81a1b704-8648-453e-b052-9a2721cf9830/neutron-httpd/0.log" Oct 06 14:40:47 crc kubenswrapper[4867]: I1006 14:40:47.979497 4867 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2_081b2d1c-3691-40fb-8fde-05e44428087d/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:47 crc kubenswrapper[4867]: I1006 14:40:47.983764 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c47455745-hd5zg_81a1b704-8648-453e-b052-9a2721cf9830/neutron-api/0.log" Oct 06 14:40:49 crc kubenswrapper[4867]: I1006 14:40:49.087395 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a1ce9788-66bb-464a-8cb4-a28f43e4228f/nova-cell0-conductor-conductor/0.log" Oct 06 14:40:49 crc kubenswrapper[4867]: I1006 14:40:49.745346 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3668fba3-af0f-478b-a41b-5de304592f65/nova-cell1-conductor-conductor/0.log" Oct 06 14:40:50 crc kubenswrapper[4867]: I1006 14:40:50.157135 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_165ccfff-2554-4af2-8ca4-be0c49e7daa8/nova-api-log/0.log" Oct 06 14:40:50 crc kubenswrapper[4867]: I1006 14:40:50.418607 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_38e2bbc3-d543-4521-bc10-88635228f1a9/nova-cell1-novncproxy-novncproxy/0.log" Oct 06 14:40:50 crc kubenswrapper[4867]: I1006 14:40:50.487928 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_165ccfff-2554-4af2-8ca4-be0c49e7daa8/nova-api-api/0.log" Oct 06 14:40:50 crc kubenswrapper[4867]: I1006 14:40:50.500133 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_40e8af9c-90c3-4d15-b8c8-c7b35447bf17/memcached/0.log" Oct 06 14:40:50 crc kubenswrapper[4867]: I1006 14:40:50.665391 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jgjsw_33622851-83c0-48c9-969d-99f96fbcb64f/nova-edpm-deployment-openstack-edpm-ipam/0.log" 
Oct 06 14:40:50 crc kubenswrapper[4867]: I1006 14:40:50.856222 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_299fc545-42f8-4889-8775-57b7aed64736/nova-metadata-log/0.log" Oct 06 14:40:51 crc kubenswrapper[4867]: I1006 14:40:51.246173 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_acd2b7ce-fe29-4b71-b730-7b1212f4416d/mysql-bootstrap/0.log" Oct 06 14:40:51 crc kubenswrapper[4867]: I1006 14:40:51.346858 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d90a63ca-3da5-420b-b2ac-b17f116f0c84/nova-scheduler-scheduler/0.log" Oct 06 14:40:51 crc kubenswrapper[4867]: I1006 14:40:51.418280 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_acd2b7ce-fe29-4b71-b730-7b1212f4416d/mysql-bootstrap/0.log" Oct 06 14:40:51 crc kubenswrapper[4867]: I1006 14:40:51.504814 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_acd2b7ce-fe29-4b71-b730-7b1212f4416d/galera/0.log" Oct 06 14:40:51 crc kubenswrapper[4867]: I1006 14:40:51.706714 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ec109351-f578-4141-8193-44f6433880b3/mysql-bootstrap/0.log" Oct 06 14:40:51 crc kubenswrapper[4867]: I1006 14:40:51.881554 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ec109351-f578-4141-8193-44f6433880b3/galera/0.log" Oct 06 14:40:51 crc kubenswrapper[4867]: I1006 14:40:51.884907 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ec109351-f578-4141-8193-44f6433880b3/mysql-bootstrap/0.log" Oct 06 14:40:52 crc kubenswrapper[4867]: I1006 14:40:52.095655 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b7620829-b468-470c-899e-92faea8bc3c7/openstackclient/0.log" Oct 06 14:40:52 crc kubenswrapper[4867]: I1006 
14:40:52.242978 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6nfld_cbe16793-d6a8-4aa9-b509-3f3b710b70e3/openstack-network-exporter/0.log" Oct 06 14:40:52 crc kubenswrapper[4867]: I1006 14:40:52.411417 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k22cm_7478d336-9573-432c-8d73-f7396d652085/ovsdb-server-init/0.log" Oct 06 14:40:52 crc kubenswrapper[4867]: I1006 14:40:52.530280 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_299fc545-42f8-4889-8775-57b7aed64736/nova-metadata-metadata/0.log" Oct 06 14:40:52 crc kubenswrapper[4867]: I1006 14:40:52.652203 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k22cm_7478d336-9573-432c-8d73-f7396d652085/ovsdb-server-init/0.log" Oct 06 14:40:52 crc kubenswrapper[4867]: I1006 14:40:52.661982 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k22cm_7478d336-9573-432c-8d73-f7396d652085/ovsdb-server/0.log" Oct 06 14:40:52 crc kubenswrapper[4867]: I1006 14:40:52.845719 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k22cm_7478d336-9573-432c-8d73-f7396d652085/ovs-vswitchd/0.log" Oct 06 14:40:52 crc kubenswrapper[4867]: I1006 14:40:52.889522 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-tg8j4_68750dd5-11c8-4fee-853c-09b68df5aff8/ovn-controller/0.log" Oct 06 14:40:53 crc kubenswrapper[4867]: I1006 14:40:53.042954 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7vftx_4960b423-de56-4b83-a577-f551c82c2702/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:53 crc kubenswrapper[4867]: I1006 14:40:53.110084 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad/openstack-network-exporter/0.log" Oct 06 14:40:53 crc kubenswrapper[4867]: I1006 14:40:53.174327 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad/ovn-northd/0.log" Oct 06 14:40:53 crc kubenswrapper[4867]: I1006 14:40:53.331494 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b8d27ae1-8b6d-4a9d-b302-a354673be3be/openstack-network-exporter/0.log" Oct 06 14:40:53 crc kubenswrapper[4867]: I1006 14:40:53.332092 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b8d27ae1-8b6d-4a9d-b302-a354673be3be/ovsdbserver-nb/0.log" Oct 06 14:40:53 crc kubenswrapper[4867]: I1006 14:40:53.472411 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_18420b8b-345a-41e6-b753-6766143362a3/openstack-network-exporter/0.log" Oct 06 14:40:53 crc kubenswrapper[4867]: I1006 14:40:53.525549 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_18420b8b-345a-41e6-b753-6766143362a3/ovsdbserver-sb/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.039596 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7d54862f-97ef-4958-8b56-4f6f590fc7da/init-config-reloader/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.106531 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-594954fbc6-c2fc2_7ddb2a04-2d3f-4340-a512-8921427ba510/placement-api/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.157295 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-594954fbc6-c2fc2_7ddb2a04-2d3f-4340-a512-8921427ba510/placement-log/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.236348 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_7d54862f-97ef-4958-8b56-4f6f590fc7da/init-config-reloader/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.276498 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7d54862f-97ef-4958-8b56-4f6f590fc7da/config-reloader/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.290119 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7d54862f-97ef-4958-8b56-4f6f590fc7da/prometheus/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.346901 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7d54862f-97ef-4958-8b56-4f6f590fc7da/thanos-sidecar/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.482668 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6669c79a-e288-4d00-8add-bffd6b33b8b9/setup-container/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.672423 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6669c79a-e288-4d00-8add-bffd6b33b8b9/rabbitmq/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.702509 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_4beec03b-3d57-4c36-a149-153bb022bd7a/setup-container/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.715981 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6669c79a-e288-4d00-8add-bffd6b33b8b9/setup-container/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.948412 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_4beec03b-3d57-4c36-a149-153bb022bd7a/setup-container/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.974948 4867 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d0a4a4a-9d75-4d2b-aeb8-1903093398d0/setup-container/0.log" Oct 06 14:40:54 crc kubenswrapper[4867]: I1006 14:40:54.990346 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_4beec03b-3d57-4c36-a149-153bb022bd7a/rabbitmq/0.log" Oct 06 14:40:55 crc kubenswrapper[4867]: I1006 14:40:55.258863 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d0a4a4a-9d75-4d2b-aeb8-1903093398d0/setup-container/0.log" Oct 06 14:40:55 crc kubenswrapper[4867]: I1006 14:40:55.264329 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d0a4a4a-9d75-4d2b-aeb8-1903093398d0/rabbitmq/0.log" Oct 06 14:40:55 crc kubenswrapper[4867]: I1006 14:40:55.314157 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w_5759403e-a3b6-4553-9e27-f471a616644f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:55 crc kubenswrapper[4867]: I1006 14:40:55.482232 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-p9xg6_05b64a8a-2fa5-4281-8e82-c27ff976b24f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:55 crc kubenswrapper[4867]: I1006 14:40:55.562441 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw_8f86cabb-0582-4b1c-993f-f9766defe823/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:55 crc kubenswrapper[4867]: I1006 14:40:55.741704 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rbjrx_8dec95c2-2ac5-4886-b1e1-ab333d4f5907/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:55 crc kubenswrapper[4867]: I1006 14:40:55.787421 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-smg6g_4b12f715-2704-4545-a627-39426cb3de93/ssh-known-hosts-edpm-deployment/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.064903 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b666bc78f-zvlqd_a06d3199-78ee-4389-bbd2-2bc53c012c84/proxy-server/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.162096 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b666bc78f-zvlqd_a06d3199-78ee-4389-bbd2-2bc53c012c84/proxy-httpd/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.294065 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-p9sjp_46037a5a-6fcb-48c6-854d-1f4e60534120/swift-ring-rebalance/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.324745 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/account-auditor/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.384614 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/account-reaper/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.499031 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/account-replicator/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.515698 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/account-server/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.518095 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/container-auditor/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.618526 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/container-replicator/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.731580 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/container-updater/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.742200 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/object-auditor/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.771802 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/container-server/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.812092 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/object-expirer/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.959821 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/object-replicator/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.966202 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/object-server/0.log" Oct 06 14:40:56 crc kubenswrapper[4867]: I1006 14:40:56.994222 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/object-updater/0.log" Oct 06 14:40:57 crc kubenswrapper[4867]: I1006 14:40:57.015631 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/rsync/0.log" Oct 06 14:40:57 crc kubenswrapper[4867]: I1006 14:40:57.144811 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/swift-recon-cron/0.log" Oct 06 14:40:57 crc kubenswrapper[4867]: I1006 14:40:57.211471 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t_a13c4977-6a03-4678-b394-0b33d74ee2a8/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:57 crc kubenswrapper[4867]: I1006 14:40:57.375787 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_43684055-87e6-4568-8a80-8019600aaeef/tempest-tests-tempest-tests-runner/0.log" Oct 06 14:40:57 crc kubenswrapper[4867]: I1006 14:40:57.469736 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9b73e25d-d1ba-4829-948f-bba412f56404/test-operator-logs-container/0.log" Oct 06 14:40:57 crc kubenswrapper[4867]: I1006 14:40:57.593322 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k_4c45ecf4-2135-407f-ab03-6c1571cd3f76/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:40:58 crc kubenswrapper[4867]: I1006 14:40:58.554051 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_5389fa15-6fc4-4154-9760-38f0653cb802/watcher-applier/0.log" Oct 06 14:40:59 crc kubenswrapper[4867]: I1006 14:40:59.198209 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_288a6591-36fc-453e-b41f-c0bed1da11b6/watcher-api-log/0.log" Oct 06 14:41:01 crc kubenswrapper[4867]: I1006 14:41:01.973620 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_886a11ab-54f5-45c1-a604-41203d080360/watcher-decision-engine/0.log" Oct 06 14:41:02 crc kubenswrapper[4867]: I1006 14:41:02.922864 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-api-0_288a6591-36fc-453e-b41f-c0bed1da11b6/watcher-api/0.log" Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.180992 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lj4ht"] Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.186679 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.194087 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lj4ht"] Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.302405 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-utilities\") pod \"redhat-operators-lj4ht\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.302577 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-catalog-content\") pod \"redhat-operators-lj4ht\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.302726 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrkkq\" (UniqueName: \"kubernetes.io/projected/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-kube-api-access-wrkkq\") pod \"redhat-operators-lj4ht\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.405079 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-utilities\") pod \"redhat-operators-lj4ht\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.405270 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-catalog-content\") pod \"redhat-operators-lj4ht\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.405416 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrkkq\" (UniqueName: \"kubernetes.io/projected/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-kube-api-access-wrkkq\") pod \"redhat-operators-lj4ht\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.405796 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-catalog-content\") pod \"redhat-operators-lj4ht\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.405972 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-utilities\") pod \"redhat-operators-lj4ht\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.448195 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrkkq\" (UniqueName: 
\"kubernetes.io/projected/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-kube-api-access-wrkkq\") pod \"redhat-operators-lj4ht\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:30 crc kubenswrapper[4867]: I1006 14:41:30.509713 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:31 crc kubenswrapper[4867]: I1006 14:41:31.068623 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lj4ht"] Oct 06 14:41:31 crc kubenswrapper[4867]: W1006 14:41:31.074038 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54f9e4b9_a358_4c4f_a556_744d62cd2b1c.slice/crio-507ce009de392d27859f189163c9fd800c1ecb26f66dc2398b5e86ab76ebebd4 WatchSource:0}: Error finding container 507ce009de392d27859f189163c9fd800c1ecb26f66dc2398b5e86ab76ebebd4: Status 404 returned error can't find the container with id 507ce009de392d27859f189163c9fd800c1ecb26f66dc2398b5e86ab76ebebd4 Oct 06 14:41:31 crc kubenswrapper[4867]: I1006 14:41:31.265225 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj4ht" event={"ID":"54f9e4b9-a358-4c4f-a556-744d62cd2b1c","Type":"ContainerStarted","Data":"507ce009de392d27859f189163c9fd800c1ecb26f66dc2398b5e86ab76ebebd4"} Oct 06 14:41:32 crc kubenswrapper[4867]: I1006 14:41:32.293001 4867 generic.go:334] "Generic (PLEG): container finished" podID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerID="b38ee2ade200e2fdadcb193143baec22fffcab966ff7ff78246881b080b74162" exitCode=0 Oct 06 14:41:32 crc kubenswrapper[4867]: I1006 14:41:32.293102 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj4ht" 
event={"ID":"54f9e4b9-a358-4c4f-a556-744d62cd2b1c","Type":"ContainerDied","Data":"b38ee2ade200e2fdadcb193143baec22fffcab966ff7ff78246881b080b74162"} Oct 06 14:41:33 crc kubenswrapper[4867]: I1006 14:41:33.323370 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj4ht" event={"ID":"54f9e4b9-a358-4c4f-a556-744d62cd2b1c","Type":"ContainerStarted","Data":"2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3"} Oct 06 14:41:36 crc kubenswrapper[4867]: I1006 14:41:36.361043 4867 generic.go:334] "Generic (PLEG): container finished" podID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerID="2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3" exitCode=0 Oct 06 14:41:36 crc kubenswrapper[4867]: I1006 14:41:36.361199 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj4ht" event={"ID":"54f9e4b9-a358-4c4f-a556-744d62cd2b1c","Type":"ContainerDied","Data":"2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3"} Oct 06 14:41:37 crc kubenswrapper[4867]: I1006 14:41:37.371737 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj4ht" event={"ID":"54f9e4b9-a358-4c4f-a556-744d62cd2b1c","Type":"ContainerStarted","Data":"886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6"} Oct 06 14:41:37 crc kubenswrapper[4867]: I1006 14:41:37.395741 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lj4ht" podStartSLOduration=2.898969907 podStartE2EDuration="7.395719112s" podCreationTimestamp="2025-10-06 14:41:30 +0000 UTC" firstStartedPulling="2025-10-06 14:41:32.296803695 +0000 UTC m=+5871.754751879" lastFinishedPulling="2025-10-06 14:41:36.79355294 +0000 UTC m=+5876.251501084" observedRunningTime="2025-10-06 14:41:37.388445158 +0000 UTC m=+5876.846393332" watchObservedRunningTime="2025-10-06 14:41:37.395719112 +0000 UTC m=+5876.853667256" Oct 
06 14:41:40 crc kubenswrapper[4867]: I1006 14:41:40.510471 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:40 crc kubenswrapper[4867]: I1006 14:41:40.510806 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:41 crc kubenswrapper[4867]: I1006 14:41:41.564232 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lj4ht" podUID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerName="registry-server" probeResult="failure" output=< Oct 06 14:41:41 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Oct 06 14:41:41 crc kubenswrapper[4867]: > Oct 06 14:41:44 crc kubenswrapper[4867]: I1006 14:41:44.449732 4867 generic.go:334] "Generic (PLEG): container finished" podID="b8565725-0d1b-49a0-a469-3b20311e5f7d" containerID="7b394c8e51551c7e87c17d42d0a568a6a6e23b64986c0fa2bef42449bf5e2302" exitCode=0 Oct 06 14:41:44 crc kubenswrapper[4867]: I1006 14:41:44.449821 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/crc-debug-4blgh" event={"ID":"b8565725-0d1b-49a0-a469-3b20311e5f7d","Type":"ContainerDied","Data":"7b394c8e51551c7e87c17d42d0a568a6a6e23b64986c0fa2bef42449bf5e2302"} Oct 06 14:41:45 crc kubenswrapper[4867]: I1006 14:41:45.564070 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-4blgh" Oct 06 14:41:45 crc kubenswrapper[4867]: I1006 14:41:45.602439 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t7lp9/crc-debug-4blgh"] Oct 06 14:41:45 crc kubenswrapper[4867]: I1006 14:41:45.606648 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qgck\" (UniqueName: \"kubernetes.io/projected/b8565725-0d1b-49a0-a469-3b20311e5f7d-kube-api-access-7qgck\") pod \"b8565725-0d1b-49a0-a469-3b20311e5f7d\" (UID: \"b8565725-0d1b-49a0-a469-3b20311e5f7d\") " Oct 06 14:41:45 crc kubenswrapper[4867]: I1006 14:41:45.606712 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8565725-0d1b-49a0-a469-3b20311e5f7d-host\") pod \"b8565725-0d1b-49a0-a469-3b20311e5f7d\" (UID: \"b8565725-0d1b-49a0-a469-3b20311e5f7d\") " Oct 06 14:41:45 crc kubenswrapper[4867]: I1006 14:41:45.606952 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8565725-0d1b-49a0-a469-3b20311e5f7d-host" (OuterVolumeSpecName: "host") pod "b8565725-0d1b-49a0-a469-3b20311e5f7d" (UID: "b8565725-0d1b-49a0-a469-3b20311e5f7d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:41:45 crc kubenswrapper[4867]: I1006 14:41:45.607841 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8565725-0d1b-49a0-a469-3b20311e5f7d-host\") on node \"crc\" DevicePath \"\"" Oct 06 14:41:45 crc kubenswrapper[4867]: I1006 14:41:45.610617 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t7lp9/crc-debug-4blgh"] Oct 06 14:41:45 crc kubenswrapper[4867]: I1006 14:41:45.618896 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8565725-0d1b-49a0-a469-3b20311e5f7d-kube-api-access-7qgck" (OuterVolumeSpecName: "kube-api-access-7qgck") pod "b8565725-0d1b-49a0-a469-3b20311e5f7d" (UID: "b8565725-0d1b-49a0-a469-3b20311e5f7d"). InnerVolumeSpecName "kube-api-access-7qgck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:41:45 crc kubenswrapper[4867]: I1006 14:41:45.709612 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qgck\" (UniqueName: \"kubernetes.io/projected/b8565725-0d1b-49a0-a469-3b20311e5f7d-kube-api-access-7qgck\") on node \"crc\" DevicePath \"\"" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.472212 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b83261b5c975d0fdc3487f50960784a9959f2e5e1ea3277f58cea22196c5cc2" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.472418 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-4blgh" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.778076 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t7lp9/crc-debug-bbsxf"] Oct 06 14:41:46 crc kubenswrapper[4867]: E1006 14:41:46.778485 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8565725-0d1b-49a0-a469-3b20311e5f7d" containerName="container-00" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.778498 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8565725-0d1b-49a0-a469-3b20311e5f7d" containerName="container-00" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.778700 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8565725-0d1b-49a0-a469-3b20311e5f7d" containerName="container-00" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.779356 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.836719 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/297d7dad-3588-4ad9-a391-9f677ce8b12a-host\") pod \"crc-debug-bbsxf\" (UID: \"297d7dad-3588-4ad9-a391-9f677ce8b12a\") " pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.836767 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtsnx\" (UniqueName: \"kubernetes.io/projected/297d7dad-3588-4ad9-a391-9f677ce8b12a-kube-api-access-qtsnx\") pod \"crc-debug-bbsxf\" (UID: \"297d7dad-3588-4ad9-a391-9f677ce8b12a\") " pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.939036 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/297d7dad-3588-4ad9-a391-9f677ce8b12a-host\") pod \"crc-debug-bbsxf\" (UID: \"297d7dad-3588-4ad9-a391-9f677ce8b12a\") " pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.939092 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtsnx\" (UniqueName: \"kubernetes.io/projected/297d7dad-3588-4ad9-a391-9f677ce8b12a-kube-api-access-qtsnx\") pod \"crc-debug-bbsxf\" (UID: \"297d7dad-3588-4ad9-a391-9f677ce8b12a\") " pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.939135 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/297d7dad-3588-4ad9-a391-9f677ce8b12a-host\") pod \"crc-debug-bbsxf\" (UID: \"297d7dad-3588-4ad9-a391-9f677ce8b12a\") " pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" Oct 06 14:41:46 crc kubenswrapper[4867]: I1006 14:41:46.961675 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtsnx\" (UniqueName: \"kubernetes.io/projected/297d7dad-3588-4ad9-a391-9f677ce8b12a-kube-api-access-qtsnx\") pod \"crc-debug-bbsxf\" (UID: \"297d7dad-3588-4ad9-a391-9f677ce8b12a\") " pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" Oct 06 14:41:47 crc kubenswrapper[4867]: I1006 14:41:47.100279 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" Oct 06 14:41:47 crc kubenswrapper[4867]: I1006 14:41:47.235937 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8565725-0d1b-49a0-a469-3b20311e5f7d" path="/var/lib/kubelet/pods/b8565725-0d1b-49a0-a469-3b20311e5f7d/volumes" Oct 06 14:41:47 crc kubenswrapper[4867]: I1006 14:41:47.483982 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" event={"ID":"297d7dad-3588-4ad9-a391-9f677ce8b12a","Type":"ContainerStarted","Data":"43999a7de2b8d328e85122160ce7c5a27b3e27e8d4b32b367b61e4ed7e5229c8"} Oct 06 14:41:47 crc kubenswrapper[4867]: I1006 14:41:47.484064 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" event={"ID":"297d7dad-3588-4ad9-a391-9f677ce8b12a","Type":"ContainerStarted","Data":"90c7e2891bb9cfbeba12c0fb79a39344f03e9f91c446713e1a0d5b66d2f83eed"} Oct 06 14:41:47 crc kubenswrapper[4867]: I1006 14:41:47.503607 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" podStartSLOduration=1.5035842019999999 podStartE2EDuration="1.503584202s" podCreationTimestamp="2025-10-06 14:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:41:47.497784696 +0000 UTC m=+5886.955732840" watchObservedRunningTime="2025-10-06 14:41:47.503584202 +0000 UTC m=+5886.961532346" Oct 06 14:41:48 crc kubenswrapper[4867]: I1006 14:41:48.498872 4867 generic.go:334] "Generic (PLEG): container finished" podID="297d7dad-3588-4ad9-a391-9f677ce8b12a" containerID="43999a7de2b8d328e85122160ce7c5a27b3e27e8d4b32b367b61e4ed7e5229c8" exitCode=0 Oct 06 14:41:48 crc kubenswrapper[4867]: I1006 14:41:48.498952 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" 
event={"ID":"297d7dad-3588-4ad9-a391-9f677ce8b12a","Type":"ContainerDied","Data":"43999a7de2b8d328e85122160ce7c5a27b3e27e8d4b32b367b61e4ed7e5229c8"} Oct 06 14:41:49 crc kubenswrapper[4867]: I1006 14:41:49.611236 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" Oct 06 14:41:49 crc kubenswrapper[4867]: I1006 14:41:49.703324 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtsnx\" (UniqueName: \"kubernetes.io/projected/297d7dad-3588-4ad9-a391-9f677ce8b12a-kube-api-access-qtsnx\") pod \"297d7dad-3588-4ad9-a391-9f677ce8b12a\" (UID: \"297d7dad-3588-4ad9-a391-9f677ce8b12a\") " Oct 06 14:41:49 crc kubenswrapper[4867]: I1006 14:41:49.703543 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/297d7dad-3588-4ad9-a391-9f677ce8b12a-host\") pod \"297d7dad-3588-4ad9-a391-9f677ce8b12a\" (UID: \"297d7dad-3588-4ad9-a391-9f677ce8b12a\") " Oct 06 14:41:49 crc kubenswrapper[4867]: I1006 14:41:49.703617 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/297d7dad-3588-4ad9-a391-9f677ce8b12a-host" (OuterVolumeSpecName: "host") pod "297d7dad-3588-4ad9-a391-9f677ce8b12a" (UID: "297d7dad-3588-4ad9-a391-9f677ce8b12a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:41:49 crc kubenswrapper[4867]: I1006 14:41:49.704551 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/297d7dad-3588-4ad9-a391-9f677ce8b12a-host\") on node \"crc\" DevicePath \"\"" Oct 06 14:41:49 crc kubenswrapper[4867]: I1006 14:41:49.710150 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/297d7dad-3588-4ad9-a391-9f677ce8b12a-kube-api-access-qtsnx" (OuterVolumeSpecName: "kube-api-access-qtsnx") pod "297d7dad-3588-4ad9-a391-9f677ce8b12a" (UID: "297d7dad-3588-4ad9-a391-9f677ce8b12a"). InnerVolumeSpecName "kube-api-access-qtsnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:41:49 crc kubenswrapper[4867]: I1006 14:41:49.805969 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtsnx\" (UniqueName: \"kubernetes.io/projected/297d7dad-3588-4ad9-a391-9f677ce8b12a-kube-api-access-qtsnx\") on node \"crc\" DevicePath \"\"" Oct 06 14:41:50 crc kubenswrapper[4867]: I1006 14:41:50.520523 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" event={"ID":"297d7dad-3588-4ad9-a391-9f677ce8b12a","Type":"ContainerDied","Data":"90c7e2891bb9cfbeba12c0fb79a39344f03e9f91c446713e1a0d5b66d2f83eed"} Oct 06 14:41:50 crc kubenswrapper[4867]: I1006 14:41:50.520575 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90c7e2891bb9cfbeba12c0fb79a39344f03e9f91c446713e1a0d5b66d2f83eed" Oct 06 14:41:50 crc kubenswrapper[4867]: I1006 14:41:50.520599 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-bbsxf" Oct 06 14:41:50 crc kubenswrapper[4867]: I1006 14:41:50.568798 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:50 crc kubenswrapper[4867]: I1006 14:41:50.618772 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:50 crc kubenswrapper[4867]: I1006 14:41:50.811214 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lj4ht"] Oct 06 14:41:52 crc kubenswrapper[4867]: I1006 14:41:52.545168 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lj4ht" podUID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerName="registry-server" containerID="cri-o://886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6" gracePeriod=2 Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.057213 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.177569 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-utilities\") pod \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.178986 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrkkq\" (UniqueName: \"kubernetes.io/projected/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-kube-api-access-wrkkq\") pod \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.179985 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-catalog-content\") pod \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\" (UID: \"54f9e4b9-a358-4c4f-a556-744d62cd2b1c\") " Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.179987 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-utilities" (OuterVolumeSpecName: "utilities") pod "54f9e4b9-a358-4c4f-a556-744d62cd2b1c" (UID: "54f9e4b9-a358-4c4f-a556-744d62cd2b1c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.181943 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.184766 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-kube-api-access-wrkkq" (OuterVolumeSpecName: "kube-api-access-wrkkq") pod "54f9e4b9-a358-4c4f-a556-744d62cd2b1c" (UID: "54f9e4b9-a358-4c4f-a556-744d62cd2b1c"). InnerVolumeSpecName "kube-api-access-wrkkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.283641 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54f9e4b9-a358-4c4f-a556-744d62cd2b1c" (UID: "54f9e4b9-a358-4c4f-a556-744d62cd2b1c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.285668 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrkkq\" (UniqueName: \"kubernetes.io/projected/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-kube-api-access-wrkkq\") on node \"crc\" DevicePath \"\"" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.285741 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54f9e4b9-a358-4c4f-a556-744d62cd2b1c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.559031 4867 generic.go:334] "Generic (PLEG): container finished" podID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerID="886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6" exitCode=0 Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.559120 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj4ht" event={"ID":"54f9e4b9-a358-4c4f-a556-744d62cd2b1c","Type":"ContainerDied","Data":"886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6"} Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.559170 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lj4ht" event={"ID":"54f9e4b9-a358-4c4f-a556-744d62cd2b1c","Type":"ContainerDied","Data":"507ce009de392d27859f189163c9fd800c1ecb26f66dc2398b5e86ab76ebebd4"} Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.559199 4867 scope.go:117] "RemoveContainer" containerID="886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.559734 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lj4ht" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.601377 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lj4ht"] Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.610191 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lj4ht"] Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.617309 4867 scope.go:117] "RemoveContainer" containerID="2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.641797 4867 scope.go:117] "RemoveContainer" containerID="b38ee2ade200e2fdadcb193143baec22fffcab966ff7ff78246881b080b74162" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.699551 4867 scope.go:117] "RemoveContainer" containerID="886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6" Oct 06 14:41:53 crc kubenswrapper[4867]: E1006 14:41:53.700583 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6\": container with ID starting with 886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6 not found: ID does not exist" containerID="886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.700641 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6"} err="failed to get container status \"886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6\": rpc error: code = NotFound desc = could not find container \"886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6\": container with ID starting with 886b78b1c0eee6eef6459df7c7d4c46bd3329fe5a09d599696ee74b8161113f6 not found: ID does 
not exist" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.700672 4867 scope.go:117] "RemoveContainer" containerID="2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3" Oct 06 14:41:53 crc kubenswrapper[4867]: E1006 14:41:53.701472 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3\": container with ID starting with 2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3 not found: ID does not exist" containerID="2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.701504 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3"} err="failed to get container status \"2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3\": rpc error: code = NotFound desc = could not find container \"2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3\": container with ID starting with 2c0d2fb769dbab0cbffd210a36489c682c99f7a89fb5193b5f06b5418135c7e3 not found: ID does not exist" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.701520 4867 scope.go:117] "RemoveContainer" containerID="b38ee2ade200e2fdadcb193143baec22fffcab966ff7ff78246881b080b74162" Oct 06 14:41:53 crc kubenswrapper[4867]: E1006 14:41:53.701854 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b38ee2ade200e2fdadcb193143baec22fffcab966ff7ff78246881b080b74162\": container with ID starting with b38ee2ade200e2fdadcb193143baec22fffcab966ff7ff78246881b080b74162 not found: ID does not exist" containerID="b38ee2ade200e2fdadcb193143baec22fffcab966ff7ff78246881b080b74162" Oct 06 14:41:53 crc kubenswrapper[4867]: I1006 14:41:53.701879 4867 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b38ee2ade200e2fdadcb193143baec22fffcab966ff7ff78246881b080b74162"} err="failed to get container status \"b38ee2ade200e2fdadcb193143baec22fffcab966ff7ff78246881b080b74162\": rpc error: code = NotFound desc = could not find container \"b38ee2ade200e2fdadcb193143baec22fffcab966ff7ff78246881b080b74162\": container with ID starting with b38ee2ade200e2fdadcb193143baec22fffcab966ff7ff78246881b080b74162 not found: ID does not exist" Oct 06 14:41:55 crc kubenswrapper[4867]: I1006 14:41:55.239987 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" path="/var/lib/kubelet/pods/54f9e4b9-a358-4c4f-a556-744d62cd2b1c/volumes" Oct 06 14:41:58 crc kubenswrapper[4867]: I1006 14:41:58.400974 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t7lp9/crc-debug-bbsxf"] Oct 06 14:41:58 crc kubenswrapper[4867]: I1006 14:41:58.409246 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t7lp9/crc-debug-bbsxf"] Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.231746 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="297d7dad-3588-4ad9-a391-9f677ce8b12a" path="/var/lib/kubelet/pods/297d7dad-3588-4ad9-a391-9f677ce8b12a/volumes" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.753574 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t7lp9/crc-debug-lpssz"] Oct 06 14:41:59 crc kubenswrapper[4867]: E1006 14:41:59.753983 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="297d7dad-3588-4ad9-a391-9f677ce8b12a" containerName="container-00" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.753996 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="297d7dad-3588-4ad9-a391-9f677ce8b12a" containerName="container-00" Oct 06 14:41:59 crc kubenswrapper[4867]: E1006 14:41:59.754020 4867 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerName="extract-utilities" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.754027 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerName="extract-utilities" Oct 06 14:41:59 crc kubenswrapper[4867]: E1006 14:41:59.754039 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerName="registry-server" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.754046 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerName="registry-server" Oct 06 14:41:59 crc kubenswrapper[4867]: E1006 14:41:59.754059 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerName="extract-content" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.754065 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerName="extract-content" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.754328 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="297d7dad-3588-4ad9-a391-9f677ce8b12a" containerName="container-00" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.754378 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="54f9e4b9-a358-4c4f-a556-744d62cd2b1c" containerName="registry-server" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.755275 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-lpssz" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.828990 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm4td\" (UniqueName: \"kubernetes.io/projected/3320dba0-67d6-49d0-85b8-f2510d40fa2e-kube-api-access-cm4td\") pod \"crc-debug-lpssz\" (UID: \"3320dba0-67d6-49d0-85b8-f2510d40fa2e\") " pod="openshift-must-gather-t7lp9/crc-debug-lpssz" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.829101 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3320dba0-67d6-49d0-85b8-f2510d40fa2e-host\") pod \"crc-debug-lpssz\" (UID: \"3320dba0-67d6-49d0-85b8-f2510d40fa2e\") " pod="openshift-must-gather-t7lp9/crc-debug-lpssz" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.931523 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm4td\" (UniqueName: \"kubernetes.io/projected/3320dba0-67d6-49d0-85b8-f2510d40fa2e-kube-api-access-cm4td\") pod \"crc-debug-lpssz\" (UID: \"3320dba0-67d6-49d0-85b8-f2510d40fa2e\") " pod="openshift-must-gather-t7lp9/crc-debug-lpssz" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.931669 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3320dba0-67d6-49d0-85b8-f2510d40fa2e-host\") pod \"crc-debug-lpssz\" (UID: \"3320dba0-67d6-49d0-85b8-f2510d40fa2e\") " pod="openshift-must-gather-t7lp9/crc-debug-lpssz" Oct 06 14:41:59 crc kubenswrapper[4867]: I1006 14:41:59.931840 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3320dba0-67d6-49d0-85b8-f2510d40fa2e-host\") pod \"crc-debug-lpssz\" (UID: \"3320dba0-67d6-49d0-85b8-f2510d40fa2e\") " pod="openshift-must-gather-t7lp9/crc-debug-lpssz" Oct 06 14:41:59 crc 
kubenswrapper[4867]: I1006 14:41:59.958968 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm4td\" (UniqueName: \"kubernetes.io/projected/3320dba0-67d6-49d0-85b8-f2510d40fa2e-kube-api-access-cm4td\") pod \"crc-debug-lpssz\" (UID: \"3320dba0-67d6-49d0-85b8-f2510d40fa2e\") " pod="openshift-must-gather-t7lp9/crc-debug-lpssz" Oct 06 14:42:00 crc kubenswrapper[4867]: I1006 14:42:00.076838 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-lpssz" Oct 06 14:42:00 crc kubenswrapper[4867]: I1006 14:42:00.654343 4867 generic.go:334] "Generic (PLEG): container finished" podID="3320dba0-67d6-49d0-85b8-f2510d40fa2e" containerID="961cb73960b5115a31e5bd25572a6c76cd1b87b6c3de4e2cc3c0c012b28a5a26" exitCode=0 Oct 06 14:42:00 crc kubenswrapper[4867]: I1006 14:42:00.654531 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/crc-debug-lpssz" event={"ID":"3320dba0-67d6-49d0-85b8-f2510d40fa2e","Type":"ContainerDied","Data":"961cb73960b5115a31e5bd25572a6c76cd1b87b6c3de4e2cc3c0c012b28a5a26"} Oct 06 14:42:00 crc kubenswrapper[4867]: I1006 14:42:00.654740 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/crc-debug-lpssz" event={"ID":"3320dba0-67d6-49d0-85b8-f2510d40fa2e","Type":"ContainerStarted","Data":"1ffe2f0866e1882a1740fcea2e906011a7fa506dd8fed156b509abc2a7c77aca"} Oct 06 14:42:00 crc kubenswrapper[4867]: I1006 14:42:00.691199 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t7lp9/crc-debug-lpssz"] Oct 06 14:42:00 crc kubenswrapper[4867]: I1006 14:42:00.698957 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t7lp9/crc-debug-lpssz"] Oct 06 14:42:01 crc kubenswrapper[4867]: I1006 14:42:01.768409 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-lpssz" Oct 06 14:42:01 crc kubenswrapper[4867]: I1006 14:42:01.877476 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm4td\" (UniqueName: \"kubernetes.io/projected/3320dba0-67d6-49d0-85b8-f2510d40fa2e-kube-api-access-cm4td\") pod \"3320dba0-67d6-49d0-85b8-f2510d40fa2e\" (UID: \"3320dba0-67d6-49d0-85b8-f2510d40fa2e\") " Oct 06 14:42:01 crc kubenswrapper[4867]: I1006 14:42:01.878852 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3320dba0-67d6-49d0-85b8-f2510d40fa2e-host\") pod \"3320dba0-67d6-49d0-85b8-f2510d40fa2e\" (UID: \"3320dba0-67d6-49d0-85b8-f2510d40fa2e\") " Oct 06 14:42:01 crc kubenswrapper[4867]: I1006 14:42:01.878948 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3320dba0-67d6-49d0-85b8-f2510d40fa2e-host" (OuterVolumeSpecName: "host") pod "3320dba0-67d6-49d0-85b8-f2510d40fa2e" (UID: "3320dba0-67d6-49d0-85b8-f2510d40fa2e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:42:01 crc kubenswrapper[4867]: I1006 14:42:01.879524 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3320dba0-67d6-49d0-85b8-f2510d40fa2e-host\") on node \"crc\" DevicePath \"\"" Oct 06 14:42:01 crc kubenswrapper[4867]: I1006 14:42:01.884029 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3320dba0-67d6-49d0-85b8-f2510d40fa2e-kube-api-access-cm4td" (OuterVolumeSpecName: "kube-api-access-cm4td") pod "3320dba0-67d6-49d0-85b8-f2510d40fa2e" (UID: "3320dba0-67d6-49d0-85b8-f2510d40fa2e"). InnerVolumeSpecName "kube-api-access-cm4td". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:42:01 crc kubenswrapper[4867]: I1006 14:42:01.981978 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm4td\" (UniqueName: \"kubernetes.io/projected/3320dba0-67d6-49d0-85b8-f2510d40fa2e-kube-api-access-cm4td\") on node \"crc\" DevicePath \"\"" Oct 06 14:42:02 crc kubenswrapper[4867]: I1006 14:42:02.493536 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/util/0.log" Oct 06 14:42:02 crc kubenswrapper[4867]: I1006 14:42:02.682156 4867 scope.go:117] "RemoveContainer" containerID="961cb73960b5115a31e5bd25572a6c76cd1b87b6c3de4e2cc3c0c012b28a5a26" Oct 06 14:42:02 crc kubenswrapper[4867]: I1006 14:42:02.682211 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t7lp9/crc-debug-lpssz" Oct 06 14:42:02 crc kubenswrapper[4867]: I1006 14:42:02.766865 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/pull/0.log" Oct 06 14:42:02 crc kubenswrapper[4867]: I1006 14:42:02.775231 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/util/0.log" Oct 06 14:42:02 crc kubenswrapper[4867]: I1006 14:42:02.797334 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/pull/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.008967 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/util/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.011041 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/pull/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.024833 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/extract/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.241001 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3320dba0-67d6-49d0-85b8-f2510d40fa2e" path="/var/lib/kubelet/pods/3320dba0-67d6-49d0-85b8-f2510d40fa2e/volumes" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.256627 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-4tgfn_c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536/manager/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.259505 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-4tgfn_c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536/kube-rbac-proxy/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.299559 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-v72nf_3c3a38a7-d3a0-4c01-aae9-645d5dada80f/kube-rbac-proxy/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.488152 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-v72nf_3c3a38a7-d3a0-4c01-aae9-645d5dada80f/manager/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 
14:42:03.504963 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-s4qrw_3b46e0ea-7a30-45ab-99cc-d36efd3fc75e/kube-rbac-proxy/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.507475 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-s4qrw_3b46e0ea-7a30-45ab-99cc-d36efd3fc75e/manager/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.695164 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-84drf_7050df56-39f0-4962-878b-7e9c498d86d4/kube-rbac-proxy/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.786276 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-84drf_7050df56-39f0-4962-878b-7e9c498d86d4/manager/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.925305 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-jxt5n_5d14ff34-79c1-467d-99b0-35202d1650bb/manager/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.929385 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-jxt5n_5d14ff34-79c1-467d-99b0-35202d1650bb/kube-rbac-proxy/0.log" Oct 06 14:42:03 crc kubenswrapper[4867]: I1006 14:42:03.979344 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-v222m_901a13c6-49ea-4126-8b2d-7c7901720f05/kube-rbac-proxy/0.log" Oct 06 14:42:04 crc kubenswrapper[4867]: I1006 14:42:04.234466 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-v222m_901a13c6-49ea-4126-8b2d-7c7901720f05/manager/0.log" 
Oct 06 14:42:04 crc kubenswrapper[4867]: I1006 14:42:04.375990 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-n64zf_311ba4cb-158b-41f4-ada4-4fed1c0f2ede/kube-rbac-proxy/0.log" Oct 06 14:42:04 crc kubenswrapper[4867]: I1006 14:42:04.561467 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-qhnpp_92cf840d-e92d-4212-8d63-2d623040ca46/kube-rbac-proxy/0.log" Oct 06 14:42:04 crc kubenswrapper[4867]: I1006 14:42:04.580931 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-n64zf_311ba4cb-158b-41f4-ada4-4fed1c0f2ede/manager/0.log" Oct 06 14:42:04 crc kubenswrapper[4867]: I1006 14:42:04.650228 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-qhnpp_92cf840d-e92d-4212-8d63-2d623040ca46/manager/0.log" Oct 06 14:42:04 crc kubenswrapper[4867]: I1006 14:42:04.801031 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-fwbwb_21147e7d-1dd6-4a90-ab7a-f923f014a281/kube-rbac-proxy/0.log" Oct 06 14:42:04 crc kubenswrapper[4867]: I1006 14:42:04.889726 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-fwbwb_21147e7d-1dd6-4a90-ab7a-f923f014a281/manager/0.log" Oct 06 14:42:04 crc kubenswrapper[4867]: I1006 14:42:04.969811 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-njdr6_831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7/kube-rbac-proxy/0.log" Oct 06 14:42:05 crc kubenswrapper[4867]: I1006 14:42:05.077557 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-njdr6_831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7/manager/0.log" Oct 06 14:42:05 crc kubenswrapper[4867]: I1006 14:42:05.204767 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg_6c53454b-e984-4366-8bd1-3c4eb10fb1c8/kube-rbac-proxy/0.log" Oct 06 14:42:05 crc kubenswrapper[4867]: I1006 14:42:05.302895 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg_6c53454b-e984-4366-8bd1-3c4eb10fb1c8/manager/0.log" Oct 06 14:42:05 crc kubenswrapper[4867]: I1006 14:42:05.498824 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-7jwqc_3d2faf90-2410-459e-a8a3-668296923f2e/kube-rbac-proxy/0.log" Oct 06 14:42:05 crc kubenswrapper[4867]: I1006 14:42:05.537566 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-7jwqc_3d2faf90-2410-459e-a8a3-668296923f2e/manager/0.log" Oct 06 14:42:05 crc kubenswrapper[4867]: I1006 14:42:05.652691 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-kpx9k_15792c9d-8f60-4b13-8623-55c9a6a7319b/kube-rbac-proxy/0.log" Oct 06 14:42:05 crc kubenswrapper[4867]: I1006 14:42:05.896080 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-kpx9k_15792c9d-8f60-4b13-8623-55c9a6a7319b/manager/0.log" Oct 06 14:42:05 crc kubenswrapper[4867]: I1006 14:42:05.991209 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-v6b9l_95e501d6-fddf-4baa-befd-25c5c5f3303e/kube-rbac-proxy/0.log" Oct 06 14:42:05 crc kubenswrapper[4867]: I1006 14:42:05.992818 4867 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-v6b9l_95e501d6-fddf-4baa-befd-25c5c5f3303e/manager/0.log" Oct 06 14:42:06 crc kubenswrapper[4867]: I1006 14:42:06.191601 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt_dbe49bb4-18db-473e-b57c-2047bbbe2405/kube-rbac-proxy/0.log" Oct 06 14:42:06 crc kubenswrapper[4867]: I1006 14:42:06.263018 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt_dbe49bb4-18db-473e-b57c-2047bbbe2405/manager/0.log" Oct 06 14:42:06 crc kubenswrapper[4867]: I1006 14:42:06.297354 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-66dbf6f685-4srz5_94790623-543f-45ee-9579-6e837ce82cd8/kube-rbac-proxy/0.log" Oct 06 14:42:06 crc kubenswrapper[4867]: I1006 14:42:06.510598 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6c5b974dc6-zhns8_2ed7554d-3165-42b7-b7ac-6ad1b620e825/kube-rbac-proxy/0.log" Oct 06 14:42:06 crc kubenswrapper[4867]: I1006 14:42:06.735585 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6c5b974dc6-zhns8_2ed7554d-3165-42b7-b7ac-6ad1b620e825/operator/0.log" Oct 06 14:42:06 crc kubenswrapper[4867]: I1006 14:42:06.739833 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7ptcs_160e7b7c-4f2e-4dba-99a5-35c4d3d9868d/registry-server/0.log" Oct 06 14:42:06 crc kubenswrapper[4867]: I1006 14:42:06.851471 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-4lsh6_efcff7d5-4481-45ea-b693-ebc63e9f1458/kube-rbac-proxy/0.log" Oct 06 
14:42:07 crc kubenswrapper[4867]: I1006 14:42:07.034405 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-922cb_05580142-d01c-470a-afcb-da956c1f6d36/kube-rbac-proxy/0.log" Oct 06 14:42:07 crc kubenswrapper[4867]: I1006 14:42:07.097299 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-4lsh6_efcff7d5-4481-45ea-b693-ebc63e9f1458/manager/0.log" Oct 06 14:42:07 crc kubenswrapper[4867]: I1006 14:42:07.151837 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-922cb_05580142-d01c-470a-afcb-da956c1f6d36/manager/0.log" Oct 06 14:42:07 crc kubenswrapper[4867]: I1006 14:42:07.415452 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp_bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3/operator/0.log" Oct 06 14:42:07 crc kubenswrapper[4867]: I1006 14:42:07.454855 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-48lgc_937696b7-f234-4e2e-97b3-9ef0f2bf0a90/kube-rbac-proxy/0.log" Oct 06 14:42:07 crc kubenswrapper[4867]: I1006 14:42:07.578210 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-48lgc_937696b7-f234-4e2e-97b3-9ef0f2bf0a90/manager/0.log" Oct 06 14:42:07 crc kubenswrapper[4867]: I1006 14:42:07.712853 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-wrrdj_d7b781c6-8500-43b4-884d-e67aadad8518/kube-rbac-proxy/0.log" Oct 06 14:42:07 crc kubenswrapper[4867]: I1006 14:42:07.842133 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-66dbf6f685-4srz5_94790623-543f-45ee-9579-6e837ce82cd8/manager/0.log" Oct 06 14:42:07 crc kubenswrapper[4867]: I1006 14:42:07.987097 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-tvpx8_9d369f1b-62ae-4b24-8287-fd62b21122ce/kube-rbac-proxy/0.log" Oct 06 14:42:08 crc kubenswrapper[4867]: I1006 14:42:08.035654 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-tvpx8_9d369f1b-62ae-4b24-8287-fd62b21122ce/manager/0.log" Oct 06 14:42:08 crc kubenswrapper[4867]: I1006 14:42:08.045143 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-wrrdj_d7b781c6-8500-43b4-884d-e67aadad8518/manager/0.log" Oct 06 14:42:08 crc kubenswrapper[4867]: I1006 14:42:08.223987 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55dcdc7cc-z7lp5_b099322d-539c-4c48-9344-62e1fec437ab/kube-rbac-proxy/0.log" Oct 06 14:42:08 crc kubenswrapper[4867]: I1006 14:42:08.261539 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55dcdc7cc-z7lp5_b099322d-539c-4c48-9344-62e1fec437ab/manager/0.log" Oct 06 14:42:24 crc kubenswrapper[4867]: I1006 14:42:24.931592 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dclxp_1204892b-a86d-4b14-9aca-1fcbd64c9cd2/control-plane-machine-set-operator/0.log" Oct 06 14:42:25 crc kubenswrapper[4867]: I1006 14:42:25.148393 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8lg7n_ed7648e1-d992-4263-9117-e50cd88a66a9/kube-rbac-proxy/0.log" Oct 06 14:42:25 crc kubenswrapper[4867]: I1006 
14:42:25.242685 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8lg7n_ed7648e1-d992-4263-9117-e50cd88a66a9/machine-api-operator/0.log" Oct 06 14:42:37 crc kubenswrapper[4867]: I1006 14:42:37.135377 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-tlbns_ed16aef2-69b2-443d-8d5d-c2122dd5b373/cert-manager-controller/0.log" Oct 06 14:42:37 crc kubenswrapper[4867]: I1006 14:42:37.333100 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-h95bg_29b819dc-d3f7-449d-812a-9a76c1d02046/cert-manager-cainjector/0.log" Oct 06 14:42:37 crc kubenswrapper[4867]: I1006 14:42:37.360970 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-rxv42_e5b18647-65b8-4ed4-bf88-542c6c583588/cert-manager-webhook/0.log" Oct 06 14:42:42 crc kubenswrapper[4867]: I1006 14:42:42.873885 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:42:42 crc kubenswrapper[4867]: I1006 14:42:42.874721 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:42:49 crc kubenswrapper[4867]: I1006 14:42:49.089351 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-trjgk_fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83/nmstate-console-plugin/0.log" Oct 06 14:42:49 crc kubenswrapper[4867]: I1006 14:42:49.261561 
4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wvv72_e19fdddd-1727-4c4b-985f-7548c278b0ca/nmstate-handler/0.log" Oct 06 14:42:49 crc kubenswrapper[4867]: I1006 14:42:49.323011 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vdksl_99a4464a-a11f-4a4e-86ae-43a9a76b060a/nmstate-metrics/0.log" Oct 06 14:42:49 crc kubenswrapper[4867]: I1006 14:42:49.339197 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vdksl_99a4464a-a11f-4a4e-86ae-43a9a76b060a/kube-rbac-proxy/0.log" Oct 06 14:42:49 crc kubenswrapper[4867]: I1006 14:42:49.522702 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-b6qnd_1bed039e-de7f-49b2-b0fe-47e8bc055e8d/nmstate-operator/0.log" Oct 06 14:42:49 crc kubenswrapper[4867]: I1006 14:42:49.607908 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-fm444_5802445f-947f-4d52-b1f3-91f404ef0088/nmstate-webhook/0.log" Oct 06 14:43:02 crc kubenswrapper[4867]: I1006 14:43:02.930614 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-qs5gj_39dc72e9-c1d5-4257-b8ea-248aaed554e5/kube-rbac-proxy/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.138455 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-qs5gj_39dc72e9-c1d5-4257-b8ea-248aaed554e5/controller/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.179716 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-frr-files/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.335880 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-reloader/0.log" Oct 
06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.386789 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-frr-files/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.394455 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-metrics/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.411674 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-reloader/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.528508 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-frr-files/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.598778 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-metrics/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.605116 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-metrics/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.612770 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-reloader/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.788959 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-frr-files/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.791410 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-reloader/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.820203 4867 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-metrics/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.843578 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/controller/0.log" Oct 06 14:43:03 crc kubenswrapper[4867]: I1006 14:43:03.965144 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/frr-metrics/0.log" Oct 06 14:43:04 crc kubenswrapper[4867]: I1006 14:43:04.084563 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/kube-rbac-proxy-frr/0.log" Oct 06 14:43:04 crc kubenswrapper[4867]: I1006 14:43:04.091732 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/kube-rbac-proxy/0.log" Oct 06 14:43:04 crc kubenswrapper[4867]: I1006 14:43:04.208189 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/reloader/0.log" Oct 06 14:43:04 crc kubenswrapper[4867]: I1006 14:43:04.365229 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-xdc52_bab9da54-1204-49fd-af69-b48a1542d2e7/frr-k8s-webhook-server/0.log" Oct 06 14:43:04 crc kubenswrapper[4867]: I1006 14:43:04.534373 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5487d99769-x5czz_43844a7c-24fd-49b1-9860-6b4a63fc136a/manager/0.log" Oct 06 14:43:04 crc kubenswrapper[4867]: I1006 14:43:04.672091 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-9d5469fbf-r6fln_fb30a785-833d-47ee-be7f-5235fbfc826c/webhook-server/0.log" Oct 06 14:43:04 crc kubenswrapper[4867]: I1006 14:43:04.907056 
4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xn6jt_e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb/kube-rbac-proxy/0.log" Oct 06 14:43:05 crc kubenswrapper[4867]: I1006 14:43:05.494116 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xn6jt_e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb/speaker/0.log" Oct 06 14:43:05 crc kubenswrapper[4867]: I1006 14:43:05.788046 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/frr/0.log" Oct 06 14:43:12 crc kubenswrapper[4867]: I1006 14:43:12.873892 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:43:12 crc kubenswrapper[4867]: I1006 14:43:12.874504 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:43:16 crc kubenswrapper[4867]: I1006 14:43:16.485603 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/util/0.log" Oct 06 14:43:16 crc kubenswrapper[4867]: I1006 14:43:16.664952 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/pull/0.log" Oct 06 14:43:16 crc kubenswrapper[4867]: I1006 14:43:16.683878 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/util/0.log" Oct 06 14:43:16 crc kubenswrapper[4867]: I1006 14:43:16.692809 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/pull/0.log" Oct 06 14:43:16 crc kubenswrapper[4867]: I1006 14:43:16.856616 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/util/0.log" Oct 06 14:43:16 crc kubenswrapper[4867]: I1006 14:43:16.858818 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/extract/0.log" Oct 06 14:43:16 crc kubenswrapper[4867]: I1006 14:43:16.888631 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/pull/0.log" Oct 06 14:43:17 crc kubenswrapper[4867]: I1006 14:43:17.030831 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/util/0.log" Oct 06 14:43:17 crc kubenswrapper[4867]: I1006 14:43:17.202432 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/pull/0.log" Oct 06 14:43:17 crc kubenswrapper[4867]: I1006 14:43:17.206360 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/util/0.log" Oct 06 
14:43:17 crc kubenswrapper[4867]: I1006 14:43:17.229911 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/pull/0.log" Oct 06 14:43:17 crc kubenswrapper[4867]: I1006 14:43:17.344974 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/util/0.log" Oct 06 14:43:17 crc kubenswrapper[4867]: I1006 14:43:17.437650 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/pull/0.log" Oct 06 14:43:17 crc kubenswrapper[4867]: I1006 14:43:17.438451 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/extract/0.log" Oct 06 14:43:17 crc kubenswrapper[4867]: I1006 14:43:17.542366 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-utilities/0.log" Oct 06 14:43:17 crc kubenswrapper[4867]: I1006 14:43:17.767138 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-content/0.log" Oct 06 14:43:17 crc kubenswrapper[4867]: I1006 14:43:17.776205 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-content/0.log" Oct 06 14:43:17 crc kubenswrapper[4867]: I1006 14:43:17.792353 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-utilities/0.log" Oct 06 
14:43:18 crc kubenswrapper[4867]: I1006 14:43:18.007713 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-content/0.log" Oct 06 14:43:18 crc kubenswrapper[4867]: I1006 14:43:18.029190 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-utilities/0.log" Oct 06 14:43:18 crc kubenswrapper[4867]: I1006 14:43:18.207732 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-utilities/0.log" Oct 06 14:43:18 crc kubenswrapper[4867]: I1006 14:43:18.300875 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/registry-server/0.log" Oct 06 14:43:18 crc kubenswrapper[4867]: I1006 14:43:18.437734 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-utilities/0.log" Oct 06 14:43:18 crc kubenswrapper[4867]: I1006 14:43:18.470036 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-content/0.log" Oct 06 14:43:18 crc kubenswrapper[4867]: I1006 14:43:18.480427 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-content/0.log" Oct 06 14:43:18 crc kubenswrapper[4867]: I1006 14:43:18.631116 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-content/0.log" Oct 06 14:43:18 crc kubenswrapper[4867]: I1006 14:43:18.647929 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-utilities/0.log" Oct 06 14:43:18 crc kubenswrapper[4867]: I1006 14:43:18.848475 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/util/0.log" Oct 06 14:43:19 crc kubenswrapper[4867]: I1006 14:43:19.115543 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/pull/0.log" Oct 06 14:43:19 crc kubenswrapper[4867]: I1006 14:43:19.145365 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/util/0.log" Oct 06 14:43:19 crc kubenswrapper[4867]: I1006 14:43:19.190317 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/pull/0.log" Oct 06 14:43:19 crc kubenswrapper[4867]: I1006 14:43:19.434001 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/pull/0.log" Oct 06 14:43:19 crc kubenswrapper[4867]: I1006 14:43:19.443643 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/extract/0.log" Oct 06 14:43:19 crc kubenswrapper[4867]: I1006 14:43:19.489973 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/util/0.log" Oct 06 14:43:19 crc 
kubenswrapper[4867]: I1006 14:43:19.717097 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/registry-server/0.log" Oct 06 14:43:19 crc kubenswrapper[4867]: I1006 14:43:19.795438 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rdmr9_298bf2ee-baaf-4fbb-a107-d712667f246e/marketplace-operator/0.log" Oct 06 14:43:19 crc kubenswrapper[4867]: I1006 14:43:19.914018 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-utilities/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.100473 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-content/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.120887 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-utilities/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.134840 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-content/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.270348 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-utilities/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.348579 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-content/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.373194 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-utilities/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.580609 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/registry-server/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.589405 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-utilities/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.590547 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-content/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.596349 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-content/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.765157 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-content/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.829655 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-utilities/0.log" Oct 06 14:43:20 crc kubenswrapper[4867]: I1006 14:43:20.982357 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/registry-server/0.log" Oct 06 14:43:32 crc kubenswrapper[4867]: I1006 14:43:32.718221 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-rxd2f_1b226da8-0bf8-4ead-b308-6677288373a3/prometheus-operator/0.log" Oct 06 14:43:32 crc kubenswrapper[4867]: I1006 14:43:32.926381 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c667696bd-98f7n_b47e5b18-abb6-4dc9-bc90-c37e31034183/prometheus-operator-admission-webhook/0.log" Oct 06 14:43:32 crc kubenswrapper[4867]: I1006 14:43:32.947600 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c667696bd-blr54_4c780336-2ad2-49ef-97b4-0161e4dceb44/prometheus-operator-admission-webhook/0.log" Oct 06 14:43:33 crc kubenswrapper[4867]: I1006 14:43:33.136838 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-zm28w_cb9ae008-7e15-4aa1-84fa-93f513646286/perses-operator/0.log" Oct 06 14:43:33 crc kubenswrapper[4867]: I1006 14:43:33.137820 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-pkjkr_d4f4e099-818f-4e18-b1d2-dc026962eb51/operator/0.log" Oct 06 14:43:42 crc kubenswrapper[4867]: I1006 14:43:42.874087 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:43:42 crc kubenswrapper[4867]: I1006 14:43:42.874760 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:43:42 crc kubenswrapper[4867]: I1006 
14:43:42.874815 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 14:43:42 crc kubenswrapper[4867]: I1006 14:43:42.875718 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 14:43:42 crc kubenswrapper[4867]: I1006 14:43:42.875772 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" gracePeriod=600 Oct 06 14:43:43 crc kubenswrapper[4867]: E1006 14:43:43.007590 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:43:43 crc kubenswrapper[4867]: I1006 14:43:43.623943 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" exitCode=0 Oct 06 14:43:43 crc kubenswrapper[4867]: I1006 14:43:43.624016 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" 
event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b"} Oct 06 14:43:43 crc kubenswrapper[4867]: I1006 14:43:43.624074 4867 scope.go:117] "RemoveContainer" containerID="f9da5b3ec3f12078011440e5be465b37e5aa4d49c61b498acf56e70c5ba427e7" Oct 06 14:43:43 crc kubenswrapper[4867]: I1006 14:43:43.625333 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:43:43 crc kubenswrapper[4867]: E1006 14:43:43.625841 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:43:56 crc kubenswrapper[4867]: I1006 14:43:56.221953 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:43:56 crc kubenswrapper[4867]: E1006 14:43:56.222961 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:44:10 crc kubenswrapper[4867]: I1006 14:44:10.222441 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:44:10 crc kubenswrapper[4867]: E1006 14:44:10.223172 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:44:23 crc kubenswrapper[4867]: I1006 14:44:23.223028 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:44:23 crc kubenswrapper[4867]: E1006 14:44:23.224155 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:44:37 crc kubenswrapper[4867]: I1006 14:44:37.221651 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:44:37 crc kubenswrapper[4867]: E1006 14:44:37.222699 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:44:52 crc kubenswrapper[4867]: I1006 14:44:52.221875 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:44:52 crc kubenswrapper[4867]: E1006 14:44:52.223770 4867 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.145523 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx"] Oct 06 14:45:00 crc kubenswrapper[4867]: E1006 14:45:00.146455 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3320dba0-67d6-49d0-85b8-f2510d40fa2e" containerName="container-00" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.146469 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3320dba0-67d6-49d0-85b8-f2510d40fa2e" containerName="container-00" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.146721 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3320dba0-67d6-49d0-85b8-f2510d40fa2e" containerName="container-00" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.147508 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.149851 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.151519 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.155856 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx"] Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.245370 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z766g\" (UniqueName: \"kubernetes.io/projected/d2a6928f-0616-4e9d-ad24-fe672ce186ab-kube-api-access-z766g\") pod \"collect-profiles-29329365-th4bx\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.245431 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2a6928f-0616-4e9d-ad24-fe672ce186ab-secret-volume\") pod \"collect-profiles-29329365-th4bx\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.245950 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2a6928f-0616-4e9d-ad24-fe672ce186ab-config-volume\") pod \"collect-profiles-29329365-th4bx\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.348480 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2a6928f-0616-4e9d-ad24-fe672ce186ab-config-volume\") pod \"collect-profiles-29329365-th4bx\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.348619 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z766g\" (UniqueName: \"kubernetes.io/projected/d2a6928f-0616-4e9d-ad24-fe672ce186ab-kube-api-access-z766g\") pod \"collect-profiles-29329365-th4bx\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.348662 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2a6928f-0616-4e9d-ad24-fe672ce186ab-secret-volume\") pod \"collect-profiles-29329365-th4bx\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.351311 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2a6928f-0616-4e9d-ad24-fe672ce186ab-config-volume\") pod \"collect-profiles-29329365-th4bx\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.367497 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d2a6928f-0616-4e9d-ad24-fe672ce186ab-secret-volume\") pod \"collect-profiles-29329365-th4bx\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.367817 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z766g\" (UniqueName: \"kubernetes.io/projected/d2a6928f-0616-4e9d-ad24-fe672ce186ab-kube-api-access-z766g\") pod \"collect-profiles-29329365-th4bx\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.470898 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:00 crc kubenswrapper[4867]: I1006 14:45:00.898077 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx"] Oct 06 14:45:01 crc kubenswrapper[4867]: I1006 14:45:01.464583 4867 generic.go:334] "Generic (PLEG): container finished" podID="d2a6928f-0616-4e9d-ad24-fe672ce186ab" containerID="2eb7867c3904fb4e1040743a877824fc0c7e95a6fcd4c747515d7b3af976562e" exitCode=0 Oct 06 14:45:01 crc kubenswrapper[4867]: I1006 14:45:01.464629 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" event={"ID":"d2a6928f-0616-4e9d-ad24-fe672ce186ab","Type":"ContainerDied","Data":"2eb7867c3904fb4e1040743a877824fc0c7e95a6fcd4c747515d7b3af976562e"} Oct 06 14:45:01 crc kubenswrapper[4867]: I1006 14:45:01.464961 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" 
event={"ID":"d2a6928f-0616-4e9d-ad24-fe672ce186ab","Type":"ContainerStarted","Data":"f9bbd2cd4aa10169d633e0ed00938883b42e5797c4a0ab763c87dc930cb2e389"} Oct 06 14:45:02 crc kubenswrapper[4867]: I1006 14:45:02.773461 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:02 crc kubenswrapper[4867]: I1006 14:45:02.908684 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2a6928f-0616-4e9d-ad24-fe672ce186ab-config-volume\") pod \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " Oct 06 14:45:02 crc kubenswrapper[4867]: I1006 14:45:02.909006 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2a6928f-0616-4e9d-ad24-fe672ce186ab-secret-volume\") pod \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " Oct 06 14:45:02 crc kubenswrapper[4867]: I1006 14:45:02.909107 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z766g\" (UniqueName: \"kubernetes.io/projected/d2a6928f-0616-4e9d-ad24-fe672ce186ab-kube-api-access-z766g\") pod \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\" (UID: \"d2a6928f-0616-4e9d-ad24-fe672ce186ab\") " Oct 06 14:45:02 crc kubenswrapper[4867]: I1006 14:45:02.909498 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a6928f-0616-4e9d-ad24-fe672ce186ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "d2a6928f-0616-4e9d-ad24-fe672ce186ab" (UID: "d2a6928f-0616-4e9d-ad24-fe672ce186ab"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:45:02 crc kubenswrapper[4867]: I1006 14:45:02.914329 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a6928f-0616-4e9d-ad24-fe672ce186ab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d2a6928f-0616-4e9d-ad24-fe672ce186ab" (UID: "d2a6928f-0616-4e9d-ad24-fe672ce186ab"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:45:02 crc kubenswrapper[4867]: I1006 14:45:02.914499 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a6928f-0616-4e9d-ad24-fe672ce186ab-kube-api-access-z766g" (OuterVolumeSpecName: "kube-api-access-z766g") pod "d2a6928f-0616-4e9d-ad24-fe672ce186ab" (UID: "d2a6928f-0616-4e9d-ad24-fe672ce186ab"). InnerVolumeSpecName "kube-api-access-z766g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:45:03 crc kubenswrapper[4867]: I1006 14:45:03.011466 4867 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d2a6928f-0616-4e9d-ad24-fe672ce186ab-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 14:45:03 crc kubenswrapper[4867]: I1006 14:45:03.011513 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z766g\" (UniqueName: \"kubernetes.io/projected/d2a6928f-0616-4e9d-ad24-fe672ce186ab-kube-api-access-z766g\") on node \"crc\" DevicePath \"\"" Oct 06 14:45:03 crc kubenswrapper[4867]: I1006 14:45:03.011523 4867 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2a6928f-0616-4e9d-ad24-fe672ce186ab-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 14:45:03 crc kubenswrapper[4867]: I1006 14:45:03.220877 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:45:03 crc kubenswrapper[4867]: E1006 
14:45:03.221174 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:45:03 crc kubenswrapper[4867]: I1006 14:45:03.486405 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" event={"ID":"d2a6928f-0616-4e9d-ad24-fe672ce186ab","Type":"ContainerDied","Data":"f9bbd2cd4aa10169d633e0ed00938883b42e5797c4a0ab763c87dc930cb2e389"} Oct 06 14:45:03 crc kubenswrapper[4867]: I1006 14:45:03.486444 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9bbd2cd4aa10169d633e0ed00938883b42e5797c4a0ab763c87dc930cb2e389" Oct 06 14:45:03 crc kubenswrapper[4867]: I1006 14:45:03.486494 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329365-th4bx" Oct 06 14:45:03 crc kubenswrapper[4867]: I1006 14:45:03.854236 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"] Oct 06 14:45:03 crc kubenswrapper[4867]: I1006 14:45:03.862665 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329320-25nxb"] Oct 06 14:45:05 crc kubenswrapper[4867]: I1006 14:45:05.233900 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ad9940f-df1b-44d5-8982-30b35e8d2d3d" path="/var/lib/kubelet/pods/6ad9940f-df1b-44d5-8982-30b35e8d2d3d/volumes" Oct 06 14:45:16 crc kubenswrapper[4867]: I1006 14:45:16.221340 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:45:16 crc kubenswrapper[4867]: E1006 14:45:16.222351 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.007951 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-svmjp"] Oct 06 14:45:18 crc kubenswrapper[4867]: E1006 14:45:18.009168 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a6928f-0616-4e9d-ad24-fe672ce186ab" containerName="collect-profiles" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.009186 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a6928f-0616-4e9d-ad24-fe672ce186ab" containerName="collect-profiles" Oct 06 14:45:18 crc 
kubenswrapper[4867]: I1006 14:45:18.009433 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a6928f-0616-4e9d-ad24-fe672ce186ab" containerName="collect-profiles" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.011101 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.019971 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svmjp"] Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.182096 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-catalog-content\") pod \"redhat-marketplace-svmjp\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.182184 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znpqd\" (UniqueName: \"kubernetes.io/projected/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-kube-api-access-znpqd\") pod \"redhat-marketplace-svmjp\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.182338 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-utilities\") pod \"redhat-marketplace-svmjp\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.284524 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-catalog-content\") pod \"redhat-marketplace-svmjp\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.284627 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znpqd\" (UniqueName: \"kubernetes.io/projected/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-kube-api-access-znpqd\") pod \"redhat-marketplace-svmjp\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.284779 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-utilities\") pod \"redhat-marketplace-svmjp\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.285075 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-catalog-content\") pod \"redhat-marketplace-svmjp\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.285318 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-utilities\") pod \"redhat-marketplace-svmjp\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.305131 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znpqd\" (UniqueName: 
\"kubernetes.io/projected/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-kube-api-access-znpqd\") pod \"redhat-marketplace-svmjp\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.339654 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:18 crc kubenswrapper[4867]: W1006 14:45:18.765290 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8445062_7af4_4fb2_aba9_0fa24e09a4d9.slice/crio-e07c5b15aa13aee052af3e5ee19e6af71215c961e0ec379beb95a6c2aa3eff8c WatchSource:0}: Error finding container e07c5b15aa13aee052af3e5ee19e6af71215c961e0ec379beb95a6c2aa3eff8c: Status 404 returned error can't find the container with id e07c5b15aa13aee052af3e5ee19e6af71215c961e0ec379beb95a6c2aa3eff8c Oct 06 14:45:18 crc kubenswrapper[4867]: I1006 14:45:18.767050 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svmjp"] Oct 06 14:45:19 crc kubenswrapper[4867]: I1006 14:45:19.660927 4867 generic.go:334] "Generic (PLEG): container finished" podID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" containerID="4b8367675022c0e121d9fb8ee45043b62621943a46fa669c8b5d21aacd77813b" exitCode=0 Oct 06 14:45:19 crc kubenswrapper[4867]: I1006 14:45:19.661040 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svmjp" event={"ID":"c8445062-7af4-4fb2-aba9-0fa24e09a4d9","Type":"ContainerDied","Data":"4b8367675022c0e121d9fb8ee45043b62621943a46fa669c8b5d21aacd77813b"} Oct 06 14:45:19 crc kubenswrapper[4867]: I1006 14:45:19.661244 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svmjp" 
event={"ID":"c8445062-7af4-4fb2-aba9-0fa24e09a4d9","Type":"ContainerStarted","Data":"e07c5b15aa13aee052af3e5ee19e6af71215c961e0ec379beb95a6c2aa3eff8c"} Oct 06 14:45:19 crc kubenswrapper[4867]: I1006 14:45:19.663138 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 14:45:20 crc kubenswrapper[4867]: I1006 14:45:20.673921 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svmjp" event={"ID":"c8445062-7af4-4fb2-aba9-0fa24e09a4d9","Type":"ContainerStarted","Data":"a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605"} Oct 06 14:45:21 crc kubenswrapper[4867]: I1006 14:45:21.686783 4867 generic.go:334] "Generic (PLEG): container finished" podID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" containerID="a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605" exitCode=0 Oct 06 14:45:21 crc kubenswrapper[4867]: I1006 14:45:21.686829 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svmjp" event={"ID":"c8445062-7af4-4fb2-aba9-0fa24e09a4d9","Type":"ContainerDied","Data":"a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605"} Oct 06 14:45:22 crc kubenswrapper[4867]: I1006 14:45:22.698346 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svmjp" event={"ID":"c8445062-7af4-4fb2-aba9-0fa24e09a4d9","Type":"ContainerStarted","Data":"068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999"} Oct 06 14:45:27 crc kubenswrapper[4867]: I1006 14:45:27.221690 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:45:27 crc kubenswrapper[4867]: E1006 14:45:27.222818 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:45:28 crc kubenswrapper[4867]: I1006 14:45:28.340200 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:28 crc kubenswrapper[4867]: I1006 14:45:28.340420 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:28 crc kubenswrapper[4867]: I1006 14:45:28.390415 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:28 crc kubenswrapper[4867]: I1006 14:45:28.410587 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-svmjp" podStartSLOduration=8.755670632 podStartE2EDuration="11.410562162s" podCreationTimestamp="2025-10-06 14:45:17 +0000 UTC" firstStartedPulling="2025-10-06 14:45:19.662902825 +0000 UTC m=+6099.120850969" lastFinishedPulling="2025-10-06 14:45:22.317794355 +0000 UTC m=+6101.775742499" observedRunningTime="2025-10-06 14:45:22.716925199 +0000 UTC m=+6102.174873373" watchObservedRunningTime="2025-10-06 14:45:28.410562162 +0000 UTC m=+6107.868510316" Oct 06 14:45:28 crc kubenswrapper[4867]: I1006 14:45:28.817975 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:28 crc kubenswrapper[4867]: I1006 14:45:28.878145 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svmjp"] Oct 06 14:45:30 crc kubenswrapper[4867]: I1006 14:45:30.781962 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-svmjp" 
podUID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" containerName="registry-server" containerID="cri-o://068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999" gracePeriod=2 Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.229646 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.406086 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-catalog-content\") pod \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.406343 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-utilities\") pod \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.406376 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znpqd\" (UniqueName: \"kubernetes.io/projected/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-kube-api-access-znpqd\") pod \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\" (UID: \"c8445062-7af4-4fb2-aba9-0fa24e09a4d9\") " Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.407837 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-utilities" (OuterVolumeSpecName: "utilities") pod "c8445062-7af4-4fb2-aba9-0fa24e09a4d9" (UID: "c8445062-7af4-4fb2-aba9-0fa24e09a4d9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.413899 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-kube-api-access-znpqd" (OuterVolumeSpecName: "kube-api-access-znpqd") pod "c8445062-7af4-4fb2-aba9-0fa24e09a4d9" (UID: "c8445062-7af4-4fb2-aba9-0fa24e09a4d9"). InnerVolumeSpecName "kube-api-access-znpqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.422391 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8445062-7af4-4fb2-aba9-0fa24e09a4d9" (UID: "c8445062-7af4-4fb2-aba9-0fa24e09a4d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.509493 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.509537 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znpqd\" (UniqueName: \"kubernetes.io/projected/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-kube-api-access-znpqd\") on node \"crc\" DevicePath \"\"" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.509550 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8445062-7af4-4fb2-aba9-0fa24e09a4d9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.794104 4867 generic.go:334] "Generic (PLEG): container finished" podID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" 
containerID="068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999" exitCode=0 Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.794154 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svmjp" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.794158 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svmjp" event={"ID":"c8445062-7af4-4fb2-aba9-0fa24e09a4d9","Type":"ContainerDied","Data":"068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999"} Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.794281 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svmjp" event={"ID":"c8445062-7af4-4fb2-aba9-0fa24e09a4d9","Type":"ContainerDied","Data":"e07c5b15aa13aee052af3e5ee19e6af71215c961e0ec379beb95a6c2aa3eff8c"} Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.794311 4867 scope.go:117] "RemoveContainer" containerID="068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.845457 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-svmjp"] Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.845650 4867 scope.go:117] "RemoveContainer" containerID="a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.864042 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-svmjp"] Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.867763 4867 scope.go:117] "RemoveContainer" containerID="4b8367675022c0e121d9fb8ee45043b62621943a46fa669c8b5d21aacd77813b" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.913754 4867 scope.go:117] "RemoveContainer" containerID="068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999" Oct 06 
14:45:31 crc kubenswrapper[4867]: E1006 14:45:31.914246 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999\": container with ID starting with 068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999 not found: ID does not exist" containerID="068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.914313 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999"} err="failed to get container status \"068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999\": rpc error: code = NotFound desc = could not find container \"068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999\": container with ID starting with 068d0902f3112d548da264c1093c16c9223c6298a9c80d53247da93cfeada999 not found: ID does not exist" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.914347 4867 scope.go:117] "RemoveContainer" containerID="a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605" Oct 06 14:45:31 crc kubenswrapper[4867]: E1006 14:45:31.914788 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605\": container with ID starting with a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605 not found: ID does not exist" containerID="a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.914836 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605"} err="failed to get container status 
\"a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605\": rpc error: code = NotFound desc = could not find container \"a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605\": container with ID starting with a1447322ea374ee91956a2e2d1f116f35e29e9aef63418ad1084d98f311be605 not found: ID does not exist" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.914864 4867 scope.go:117] "RemoveContainer" containerID="4b8367675022c0e121d9fb8ee45043b62621943a46fa669c8b5d21aacd77813b" Oct 06 14:45:31 crc kubenswrapper[4867]: E1006 14:45:31.915180 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b8367675022c0e121d9fb8ee45043b62621943a46fa669c8b5d21aacd77813b\": container with ID starting with 4b8367675022c0e121d9fb8ee45043b62621943a46fa669c8b5d21aacd77813b not found: ID does not exist" containerID="4b8367675022c0e121d9fb8ee45043b62621943a46fa669c8b5d21aacd77813b" Oct 06 14:45:31 crc kubenswrapper[4867]: I1006 14:45:31.915234 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b8367675022c0e121d9fb8ee45043b62621943a46fa669c8b5d21aacd77813b"} err="failed to get container status \"4b8367675022c0e121d9fb8ee45043b62621943a46fa669c8b5d21aacd77813b\": rpc error: code = NotFound desc = could not find container \"4b8367675022c0e121d9fb8ee45043b62621943a46fa669c8b5d21aacd77813b\": container with ID starting with 4b8367675022c0e121d9fb8ee45043b62621943a46fa669c8b5d21aacd77813b not found: ID does not exist" Oct 06 14:45:33 crc kubenswrapper[4867]: I1006 14:45:33.245457 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" path="/var/lib/kubelet/pods/c8445062-7af4-4fb2-aba9-0fa24e09a4d9/volumes" Oct 06 14:45:40 crc kubenswrapper[4867]: I1006 14:45:40.222847 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 
14:45:40 crc kubenswrapper[4867]: E1006 14:45:40.224830 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:45:55 crc kubenswrapper[4867]: I1006 14:45:55.036974 4867 generic.go:334] "Generic (PLEG): container finished" podID="ac97278f-c2d6-4bb4-849d-7e2024d818bb" containerID="9f77c77d9961795d391b57ee71fdc40d16ffd45467d81836b33591ced95c58df" exitCode=0 Oct 06 14:45:55 crc kubenswrapper[4867]: I1006 14:45:55.037064 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t7lp9/must-gather-7rw7n" event={"ID":"ac97278f-c2d6-4bb4-849d-7e2024d818bb","Type":"ContainerDied","Data":"9f77c77d9961795d391b57ee71fdc40d16ffd45467d81836b33591ced95c58df"} Oct 06 14:45:55 crc kubenswrapper[4867]: I1006 14:45:55.038586 4867 scope.go:117] "RemoveContainer" containerID="9f77c77d9961795d391b57ee71fdc40d16ffd45467d81836b33591ced95c58df" Oct 06 14:45:55 crc kubenswrapper[4867]: I1006 14:45:55.221430 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:45:55 crc kubenswrapper[4867]: E1006 14:45:55.221722 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:45:55 crc kubenswrapper[4867]: I1006 14:45:55.497840 4867 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-t7lp9_must-gather-7rw7n_ac97278f-c2d6-4bb4-849d-7e2024d818bb/gather/0.log" Oct 06 14:46:03 crc kubenswrapper[4867]: I1006 14:46:03.849322 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t7lp9/must-gather-7rw7n"] Oct 06 14:46:03 crc kubenswrapper[4867]: I1006 14:46:03.850396 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-t7lp9/must-gather-7rw7n" podUID="ac97278f-c2d6-4bb4-849d-7e2024d818bb" containerName="copy" containerID="cri-o://ee292c5efc01c94ca7c254211d4a9fb23fdf6a0958a5ea61dec194b1d131b213" gracePeriod=2 Oct 06 14:46:03 crc kubenswrapper[4867]: I1006 14:46:03.880715 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t7lp9/must-gather-7rw7n"] Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.139130 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t7lp9_must-gather-7rw7n_ac97278f-c2d6-4bb4-849d-7e2024d818bb/copy/0.log" Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.139686 4867 generic.go:334] "Generic (PLEG): container finished" podID="ac97278f-c2d6-4bb4-849d-7e2024d818bb" containerID="ee292c5efc01c94ca7c254211d4a9fb23fdf6a0958a5ea61dec194b1d131b213" exitCode=143 Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.339378 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t7lp9_must-gather-7rw7n_ac97278f-c2d6-4bb4-849d-7e2024d818bb/copy/0.log" Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.340021 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t7lp9/must-gather-7rw7n" Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.434014 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpx6l\" (UniqueName: \"kubernetes.io/projected/ac97278f-c2d6-4bb4-849d-7e2024d818bb-kube-api-access-fpx6l\") pod \"ac97278f-c2d6-4bb4-849d-7e2024d818bb\" (UID: \"ac97278f-c2d6-4bb4-849d-7e2024d818bb\") " Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.434132 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ac97278f-c2d6-4bb4-849d-7e2024d818bb-must-gather-output\") pod \"ac97278f-c2d6-4bb4-849d-7e2024d818bb\" (UID: \"ac97278f-c2d6-4bb4-849d-7e2024d818bb\") " Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.451913 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac97278f-c2d6-4bb4-849d-7e2024d818bb-kube-api-access-fpx6l" (OuterVolumeSpecName: "kube-api-access-fpx6l") pod "ac97278f-c2d6-4bb4-849d-7e2024d818bb" (UID: "ac97278f-c2d6-4bb4-849d-7e2024d818bb"). InnerVolumeSpecName "kube-api-access-fpx6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.536814 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpx6l\" (UniqueName: \"kubernetes.io/projected/ac97278f-c2d6-4bb4-849d-7e2024d818bb-kube-api-access-fpx6l\") on node \"crc\" DevicePath \"\"" Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.628938 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac97278f-c2d6-4bb4-849d-7e2024d818bb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ac97278f-c2d6-4bb4-849d-7e2024d818bb" (UID: "ac97278f-c2d6-4bb4-849d-7e2024d818bb"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.638914 4867 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ac97278f-c2d6-4bb4-849d-7e2024d818bb-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.660478 4867 scope.go:117] "RemoveContainer" containerID="ee292c5efc01c94ca7c254211d4a9fb23fdf6a0958a5ea61dec194b1d131b213" Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.684454 4867 scope.go:117] "RemoveContainer" containerID="9f77c77d9961795d391b57ee71fdc40d16ffd45467d81836b33591ced95c58df" Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.755969 4867 scope.go:117] "RemoveContainer" containerID="b50414276136ecd6e8be8aecf45ecd681137ddaed6545e5db169d8df44e744e4" Oct 06 14:46:04 crc kubenswrapper[4867]: I1006 14:46:04.773643 4867 scope.go:117] "RemoveContainer" containerID="7b394c8e51551c7e87c17d42d0a568a6a6e23b64986c0fa2bef42449bf5e2302" Oct 06 14:46:05 crc kubenswrapper[4867]: I1006 14:46:05.147875 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t7lp9/must-gather-7rw7n" Oct 06 14:46:05 crc kubenswrapper[4867]: I1006 14:46:05.231585 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac97278f-c2d6-4bb4-849d-7e2024d818bb" path="/var/lib/kubelet/pods/ac97278f-c2d6-4bb4-849d-7e2024d818bb/volumes" Oct 06 14:46:09 crc kubenswrapper[4867]: I1006 14:46:09.222184 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:46:09 crc kubenswrapper[4867]: E1006 14:46:09.222947 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:46:22 crc kubenswrapper[4867]: I1006 14:46:22.221775 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:46:22 crc kubenswrapper[4867]: E1006 14:46:22.222557 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.699333 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fq2cw/must-gather-7xspx"] Oct 06 14:46:30 crc kubenswrapper[4867]: E1006 14:46:30.700408 4867 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" containerName="extract-utilities" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.700424 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" containerName="extract-utilities" Oct 06 14:46:30 crc kubenswrapper[4867]: E1006 14:46:30.700456 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac97278f-c2d6-4bb4-849d-7e2024d818bb" containerName="gather" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.700464 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac97278f-c2d6-4bb4-849d-7e2024d818bb" containerName="gather" Oct 06 14:46:30 crc kubenswrapper[4867]: E1006 14:46:30.700473 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" containerName="registry-server" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.700480 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" containerName="registry-server" Oct 06 14:46:30 crc kubenswrapper[4867]: E1006 14:46:30.700494 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" containerName="extract-content" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.700502 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" containerName="extract-content" Oct 06 14:46:30 crc kubenswrapper[4867]: E1006 14:46:30.700519 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac97278f-c2d6-4bb4-849d-7e2024d818bb" containerName="copy" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.700526 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac97278f-c2d6-4bb4-849d-7e2024d818bb" containerName="copy" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.700763 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac97278f-c2d6-4bb4-849d-7e2024d818bb" 
containerName="gather" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.700792 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8445062-7af4-4fb2-aba9-0fa24e09a4d9" containerName="registry-server" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.700807 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac97278f-c2d6-4bb4-849d-7e2024d818bb" containerName="copy" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.702124 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fq2cw/must-gather-7xspx" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.707346 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fq2cw"/"openshift-service-ca.crt" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.708882 4867 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fq2cw"/"default-dockercfg-r5k9b" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.717908 4867 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fq2cw"/"kube-root-ca.crt" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.742989 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fq2cw/must-gather-7xspx"] Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.793498 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxqzl\" (UniqueName: \"kubernetes.io/projected/0c0c2dd3-6303-4a00-be14-8af305c08842-kube-api-access-kxqzl\") pod \"must-gather-7xspx\" (UID: \"0c0c2dd3-6303-4a00-be14-8af305c08842\") " pod="openshift-must-gather-fq2cw/must-gather-7xspx" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.793615 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/0c0c2dd3-6303-4a00-be14-8af305c08842-must-gather-output\") pod \"must-gather-7xspx\" (UID: \"0c0c2dd3-6303-4a00-be14-8af305c08842\") " pod="openshift-must-gather-fq2cw/must-gather-7xspx" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.896662 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxqzl\" (UniqueName: \"kubernetes.io/projected/0c0c2dd3-6303-4a00-be14-8af305c08842-kube-api-access-kxqzl\") pod \"must-gather-7xspx\" (UID: \"0c0c2dd3-6303-4a00-be14-8af305c08842\") " pod="openshift-must-gather-fq2cw/must-gather-7xspx" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.896768 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0c0c2dd3-6303-4a00-be14-8af305c08842-must-gather-output\") pod \"must-gather-7xspx\" (UID: \"0c0c2dd3-6303-4a00-be14-8af305c08842\") " pod="openshift-must-gather-fq2cw/must-gather-7xspx" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.897295 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0c0c2dd3-6303-4a00-be14-8af305c08842-must-gather-output\") pod \"must-gather-7xspx\" (UID: \"0c0c2dd3-6303-4a00-be14-8af305c08842\") " pod="openshift-must-gather-fq2cw/must-gather-7xspx" Oct 06 14:46:30 crc kubenswrapper[4867]: I1006 14:46:30.927191 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxqzl\" (UniqueName: \"kubernetes.io/projected/0c0c2dd3-6303-4a00-be14-8af305c08842-kube-api-access-kxqzl\") pod \"must-gather-7xspx\" (UID: \"0c0c2dd3-6303-4a00-be14-8af305c08842\") " pod="openshift-must-gather-fq2cw/must-gather-7xspx" Oct 06 14:46:31 crc kubenswrapper[4867]: I1006 14:46:31.020007 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq2cw/must-gather-7xspx" Oct 06 14:46:31 crc kubenswrapper[4867]: I1006 14:46:31.653053 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fq2cw/must-gather-7xspx"] Oct 06 14:46:32 crc kubenswrapper[4867]: I1006 14:46:32.432110 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/must-gather-7xspx" event={"ID":"0c0c2dd3-6303-4a00-be14-8af305c08842","Type":"ContainerStarted","Data":"f20472ab52eeba7cd39ec5a95b4f98cc2bd3582cef1c37987d053fa461cdffc2"} Oct 06 14:46:32 crc kubenswrapper[4867]: I1006 14:46:32.432698 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/must-gather-7xspx" event={"ID":"0c0c2dd3-6303-4a00-be14-8af305c08842","Type":"ContainerStarted","Data":"0dbc7aa1c2d805835e66377ce807601a751aa5336dc091b3a8d6c8bbb98ca668"} Oct 06 14:46:32 crc kubenswrapper[4867]: I1006 14:46:32.432723 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/must-gather-7xspx" event={"ID":"0c0c2dd3-6303-4a00-be14-8af305c08842","Type":"ContainerStarted","Data":"24736296695e6ff3e5ba42aa0770442fd4ab79abb39054526722e352acf818a9"} Oct 06 14:46:32 crc kubenswrapper[4867]: I1006 14:46:32.458072 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fq2cw/must-gather-7xspx" podStartSLOduration=2.458049356 podStartE2EDuration="2.458049356s" podCreationTimestamp="2025-10-06 14:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:46:32.455365024 +0000 UTC m=+6171.913313178" watchObservedRunningTime="2025-10-06 14:46:32.458049356 +0000 UTC m=+6171.915997500" Oct 06 14:46:34 crc kubenswrapper[4867]: I1006 14:46:34.221865 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:46:34 crc 
kubenswrapper[4867]: E1006 14:46:34.222527 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:46:35 crc kubenswrapper[4867]: I1006 14:46:35.604203 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fq2cw/crc-debug-lz7lx"] Oct 06 14:46:35 crc kubenswrapper[4867]: I1006 14:46:35.606158 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" Oct 06 14:46:35 crc kubenswrapper[4867]: I1006 14:46:35.705587 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsrvc\" (UniqueName: \"kubernetes.io/projected/cebf5a2c-4133-4007-8813-5d872f3ef665-kube-api-access-hsrvc\") pod \"crc-debug-lz7lx\" (UID: \"cebf5a2c-4133-4007-8813-5d872f3ef665\") " pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" Oct 06 14:46:35 crc kubenswrapper[4867]: I1006 14:46:35.705715 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cebf5a2c-4133-4007-8813-5d872f3ef665-host\") pod \"crc-debug-lz7lx\" (UID: \"cebf5a2c-4133-4007-8813-5d872f3ef665\") " pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" Oct 06 14:46:35 crc kubenswrapper[4867]: I1006 14:46:35.807300 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cebf5a2c-4133-4007-8813-5d872f3ef665-host\") pod \"crc-debug-lz7lx\" (UID: \"cebf5a2c-4133-4007-8813-5d872f3ef665\") " pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" Oct 06 14:46:35 
crc kubenswrapper[4867]: I1006 14:46:35.807829 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsrvc\" (UniqueName: \"kubernetes.io/projected/cebf5a2c-4133-4007-8813-5d872f3ef665-kube-api-access-hsrvc\") pod \"crc-debug-lz7lx\" (UID: \"cebf5a2c-4133-4007-8813-5d872f3ef665\") " pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" Oct 06 14:46:35 crc kubenswrapper[4867]: I1006 14:46:35.807689 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cebf5a2c-4133-4007-8813-5d872f3ef665-host\") pod \"crc-debug-lz7lx\" (UID: \"cebf5a2c-4133-4007-8813-5d872f3ef665\") " pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" Oct 06 14:46:35 crc kubenswrapper[4867]: I1006 14:46:35.828103 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsrvc\" (UniqueName: \"kubernetes.io/projected/cebf5a2c-4133-4007-8813-5d872f3ef665-kube-api-access-hsrvc\") pod \"crc-debug-lz7lx\" (UID: \"cebf5a2c-4133-4007-8813-5d872f3ef665\") " pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" Oct 06 14:46:35 crc kubenswrapper[4867]: I1006 14:46:35.927657 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" Oct 06 14:46:35 crc kubenswrapper[4867]: W1006 14:46:35.953486 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcebf5a2c_4133_4007_8813_5d872f3ef665.slice/crio-287d7f32221b4f2d799494dce260f81bbf96e581b65bad31f8adc856a062711b WatchSource:0}: Error finding container 287d7f32221b4f2d799494dce260f81bbf96e581b65bad31f8adc856a062711b: Status 404 returned error can't find the container with id 287d7f32221b4f2d799494dce260f81bbf96e581b65bad31f8adc856a062711b Oct 06 14:46:36 crc kubenswrapper[4867]: I1006 14:46:36.475702 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" event={"ID":"cebf5a2c-4133-4007-8813-5d872f3ef665","Type":"ContainerStarted","Data":"e2d1be4b07dad6ac43aff080214467294bc18ee6cda63e9b7af43f4c4e50ab11"} Oct 06 14:46:36 crc kubenswrapper[4867]: I1006 14:46:36.476020 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" event={"ID":"cebf5a2c-4133-4007-8813-5d872f3ef665","Type":"ContainerStarted","Data":"287d7f32221b4f2d799494dce260f81bbf96e581b65bad31f8adc856a062711b"} Oct 06 14:46:36 crc kubenswrapper[4867]: I1006 14:46:36.499673 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" podStartSLOduration=1.499653748 podStartE2EDuration="1.499653748s" podCreationTimestamp="2025-10-06 14:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:46:36.491910391 +0000 UTC m=+6175.949858535" watchObservedRunningTime="2025-10-06 14:46:36.499653748 +0000 UTC m=+6175.957601892" Oct 06 14:46:48 crc kubenswrapper[4867]: I1006 14:46:48.221940 4867 scope.go:117] "RemoveContainer" 
containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:46:48 crc kubenswrapper[4867]: E1006 14:46:48.223180 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:47:01 crc kubenswrapper[4867]: I1006 14:47:01.231425 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:47:01 crc kubenswrapper[4867]: E1006 14:47:01.232646 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:47:14 crc kubenswrapper[4867]: I1006 14:47:14.222062 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:47:14 crc kubenswrapper[4867]: E1006 14:47:14.223015 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:47:25 crc kubenswrapper[4867]: I1006 14:47:25.222372 4867 scope.go:117] 
"RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:47:25 crc kubenswrapper[4867]: E1006 14:47:25.223372 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:47:37 crc kubenswrapper[4867]: I1006 14:47:37.221657 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:47:37 crc kubenswrapper[4867]: E1006 14:47:37.222481 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:47:50 crc kubenswrapper[4867]: I1006 14:47:50.221395 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:47:50 crc kubenswrapper[4867]: E1006 14:47:50.223035 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:47:59 crc kubenswrapper[4867]: I1006 14:47:59.490169 
4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-86d4db6f74-khhjk_a72dc3e7-d107-4153-9ce3-b092369b5d66/barbican-api/0.log" Oct 06 14:47:59 crc kubenswrapper[4867]: I1006 14:47:59.585871 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-86d4db6f74-khhjk_a72dc3e7-d107-4153-9ce3-b092369b5d66/barbican-api-log/0.log" Oct 06 14:47:59 crc kubenswrapper[4867]: I1006 14:47:59.720544 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f64dc9ddb-5ltwp_cdf0758d-d2e6-4660-8b3b-677c5febec8f/barbican-keystone-listener/0.log" Oct 06 14:47:59 crc kubenswrapper[4867]: I1006 14:47:59.818407 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f64dc9ddb-5ltwp_cdf0758d-d2e6-4660-8b3b-677c5febec8f/barbican-keystone-listener-log/0.log" Oct 06 14:47:59 crc kubenswrapper[4867]: I1006 14:47:59.929508 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5fc4fffc87-p6rts_f6057ffc-7d15-4097-b9d2-677fa9e69920/barbican-worker/0.log" Oct 06 14:48:00 crc kubenswrapper[4867]: I1006 14:48:00.013399 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5fc4fffc87-p6rts_f6057ffc-7d15-4097-b9d2-677fa9e69920/barbican-worker-log/0.log" Oct 06 14:48:00 crc kubenswrapper[4867]: I1006 14:48:00.159789 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-kflv9_e7ba5c1b-0dcb-4509-bb81-4bda347944bf/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:00 crc kubenswrapper[4867]: I1006 14:48:00.420962 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5383175c-d1e2-4f75-a9b6-5986e5062009/ceilometer-notification-agent/0.log" Oct 06 14:48:00 crc kubenswrapper[4867]: I1006 14:48:00.438966 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_5383175c-d1e2-4f75-a9b6-5986e5062009/ceilometer-central-agent/0.log" Oct 06 14:48:00 crc kubenswrapper[4867]: I1006 14:48:00.457949 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5383175c-d1e2-4f75-a9b6-5986e5062009/proxy-httpd/0.log" Oct 06 14:48:00 crc kubenswrapper[4867]: I1006 14:48:00.583203 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5383175c-d1e2-4f75-a9b6-5986e5062009/sg-core/0.log" Oct 06 14:48:00 crc kubenswrapper[4867]: I1006 14:48:00.867803 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_678b77f0-1e51-4788-a7e0-4bc2560a9c6a/cinder-api-log/0.log" Oct 06 14:48:01 crc kubenswrapper[4867]: I1006 14:48:01.067579 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_678b77f0-1e51-4788-a7e0-4bc2560a9c6a/cinder-api/0.log" Oct 06 14:48:01 crc kubenswrapper[4867]: I1006 14:48:01.192408 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a3850291-2d24-472c-9ef2-7f2814c4c321/cinder-scheduler/0.log" Oct 06 14:48:01 crc kubenswrapper[4867]: I1006 14:48:01.282832 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a3850291-2d24-472c-9ef2-7f2814c4c321/probe/0.log" Oct 06 14:48:01 crc kubenswrapper[4867]: I1006 14:48:01.446749 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hbpqg_68b10f2c-285a-4492-90c0-1a3d83ab46e7/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:01 crc kubenswrapper[4867]: I1006 14:48:01.589293 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7zljl_cfcaa47d-7ad1-422b-8c1b-b4aa8d59218c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:01 crc kubenswrapper[4867]: I1006 14:48:01.795511 
4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mvgb9_8b54a7c9-430b-4dfc-9ffb-ae3c790372ee/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:01 crc kubenswrapper[4867]: I1006 14:48:01.892170 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c48bdb645-wtbz6_662008a2-cb52-48d6-bd6e-7e1c9bd511cf/init/0.log" Oct 06 14:48:02 crc kubenswrapper[4867]: I1006 14:48:02.067194 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c48bdb645-wtbz6_662008a2-cb52-48d6-bd6e-7e1c9bd511cf/init/0.log" Oct 06 14:48:02 crc kubenswrapper[4867]: I1006 14:48:02.235753 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c48bdb645-wtbz6_662008a2-cb52-48d6-bd6e-7e1c9bd511cf/dnsmasq-dns/0.log" Oct 06 14:48:02 crc kubenswrapper[4867]: I1006 14:48:02.336096 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-tb7f5_b214a0d6-e528-435a-9126-04d18492d264/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:02 crc kubenswrapper[4867]: I1006 14:48:02.435911 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fbbe30dd-179b-4e2b-b011-b395c30e32a9/glance-httpd/0.log" Oct 06 14:48:02 crc kubenswrapper[4867]: I1006 14:48:02.531264 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fbbe30dd-179b-4e2b-b011-b395c30e32a9/glance-log/0.log" Oct 06 14:48:02 crc kubenswrapper[4867]: I1006 14:48:02.658222 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fe3366d3-09d4-49fb-a388-3291fe1e65b0/glance-httpd/0.log" Oct 06 14:48:02 crc kubenswrapper[4867]: I1006 14:48:02.728593 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_fe3366d3-09d4-49fb-a388-3291fe1e65b0/glance-log/0.log" Oct 06 14:48:02 crc kubenswrapper[4867]: I1006 14:48:02.904012 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-69d5cf7ffb-c2rgt_d7e92d5c-74ed-47bc-995a-d3712014f109/horizon/0.log" Oct 06 14:48:03 crc kubenswrapper[4867]: I1006 14:48:03.174667 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-v7cdf_4c656316-c726-4675-9209-cf119811bc63/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:03 crc kubenswrapper[4867]: I1006 14:48:03.249485 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qbxjs_828fbe77-2fb8-4ed5-b64f-733c1dad834d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:03 crc kubenswrapper[4867]: I1006 14:48:03.451427 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29329321-p9bh2_ceb3352a-f644-4721-9e76-8c27cb9e26ac/keystone-cron/0.log" Oct 06 14:48:03 crc kubenswrapper[4867]: I1006 14:48:03.747241 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-69d5cf7ffb-c2rgt_d7e92d5c-74ed-47bc-995a-d3712014f109/horizon-log/0.log" Oct 06 14:48:03 crc kubenswrapper[4867]: I1006 14:48:03.820058 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_feb72ccb-56bd-433d-b82c-6002fed1e09d/kube-state-metrics/0.log" Oct 06 14:48:04 crc kubenswrapper[4867]: I1006 14:48:04.077314 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rqmz9_019fe1d6-0cfb-4b47-93bc-d08b1bd0f4be/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:04 crc kubenswrapper[4867]: I1006 14:48:04.221532 4867 scope.go:117] "RemoveContainer" 
containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:48:04 crc kubenswrapper[4867]: E1006 14:48:04.221817 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:48:04 crc kubenswrapper[4867]: I1006 14:48:04.238226 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-dcf7c7d6f-dz9mk_dd2a23fb-89c2-4a8c-b670-3f8330f13265/keystone-api/0.log" Oct 06 14:48:04 crc kubenswrapper[4867]: I1006 14:48:04.756424 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c47455745-hd5zg_81a1b704-8648-453e-b052-9a2721cf9830/neutron-httpd/0.log" Oct 06 14:48:04 crc kubenswrapper[4867]: I1006 14:48:04.796240 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c47455745-hd5zg_81a1b704-8648-453e-b052-9a2721cf9830/neutron-api/0.log" Oct 06 14:48:04 crc kubenswrapper[4867]: I1006 14:48:04.860412 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-lvbj2_081b2d1c-3691-40fb-8fde-05e44428087d/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:04 crc kubenswrapper[4867]: I1006 14:48:04.918139 4867 scope.go:117] "RemoveContainer" containerID="43999a7de2b8d328e85122160ce7c5a27b3e27e8d4b32b367b61e4ed7e5229c8" Oct 06 14:48:05 crc kubenswrapper[4867]: I1006 14:48:05.953631 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a1ce9788-66bb-464a-8cb4-a28f43e4228f/nova-cell0-conductor-conductor/0.log" Oct 06 14:48:06 crc kubenswrapper[4867]: I1006 14:48:06.757678 
4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3668fba3-af0f-478b-a41b-5de304592f65/nova-cell1-conductor-conductor/0.log" Oct 06 14:48:07 crc kubenswrapper[4867]: I1006 14:48:07.068427 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_165ccfff-2554-4af2-8ca4-be0c49e7daa8/nova-api-log/0.log" Oct 06 14:48:07 crc kubenswrapper[4867]: I1006 14:48:07.484145 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_38e2bbc3-d543-4521-bc10-88635228f1a9/nova-cell1-novncproxy-novncproxy/0.log" Oct 06 14:48:07 crc kubenswrapper[4867]: I1006 14:48:07.681793 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_165ccfff-2554-4af2-8ca4-be0c49e7daa8/nova-api-api/0.log" Oct 06 14:48:07 crc kubenswrapper[4867]: I1006 14:48:07.697852 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jgjsw_33622851-83c0-48c9-969d-99f96fbcb64f/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:08 crc kubenswrapper[4867]: I1006 14:48:08.082136 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_299fc545-42f8-4889-8775-57b7aed64736/nova-metadata-log/0.log" Oct 06 14:48:08 crc kubenswrapper[4867]: I1006 14:48:08.544723 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_acd2b7ce-fe29-4b71-b730-7b1212f4416d/mysql-bootstrap/0.log" Oct 06 14:48:08 crc kubenswrapper[4867]: I1006 14:48:08.745919 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d90a63ca-3da5-420b-b2ac-b17f116f0c84/nova-scheduler-scheduler/0.log" Oct 06 14:48:08 crc kubenswrapper[4867]: I1006 14:48:08.830041 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_acd2b7ce-fe29-4b71-b730-7b1212f4416d/mysql-bootstrap/0.log" Oct 06 14:48:08 crc 
kubenswrapper[4867]: I1006 14:48:08.960570 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_acd2b7ce-fe29-4b71-b730-7b1212f4416d/galera/0.log" Oct 06 14:48:09 crc kubenswrapper[4867]: I1006 14:48:09.229775 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ec109351-f578-4141-8193-44f6433880b3/mysql-bootstrap/0.log" Oct 06 14:48:09 crc kubenswrapper[4867]: I1006 14:48:09.465058 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ec109351-f578-4141-8193-44f6433880b3/galera/0.log" Oct 06 14:48:09 crc kubenswrapper[4867]: I1006 14:48:09.467548 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ec109351-f578-4141-8193-44f6433880b3/mysql-bootstrap/0.log" Oct 06 14:48:09 crc kubenswrapper[4867]: I1006 14:48:09.681566 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b7620829-b468-470c-899e-92faea8bc3c7/openstackclient/0.log" Oct 06 14:48:09 crc kubenswrapper[4867]: I1006 14:48:09.928895 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6nfld_cbe16793-d6a8-4aa9-b509-3f3b710b70e3/openstack-network-exporter/0.log" Oct 06 14:48:10 crc kubenswrapper[4867]: I1006 14:48:10.315480 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k22cm_7478d336-9573-432c-8d73-f7396d652085/ovsdb-server-init/0.log" Oct 06 14:48:10 crc kubenswrapper[4867]: I1006 14:48:10.592553 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k22cm_7478d336-9573-432c-8d73-f7396d652085/ovsdb-server-init/0.log" Oct 06 14:48:10 crc kubenswrapper[4867]: I1006 14:48:10.746921 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_299fc545-42f8-4889-8775-57b7aed64736/nova-metadata-metadata/0.log" Oct 06 14:48:10 crc kubenswrapper[4867]: I1006 
14:48:10.815656 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k22cm_7478d336-9573-432c-8d73-f7396d652085/ovsdb-server/0.log" Oct 06 14:48:11 crc kubenswrapper[4867]: I1006 14:48:11.029237 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k22cm_7478d336-9573-432c-8d73-f7396d652085/ovs-vswitchd/0.log" Oct 06 14:48:11 crc kubenswrapper[4867]: I1006 14:48:11.058072 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_40e8af9c-90c3-4d15-b8c8-c7b35447bf17/memcached/0.log" Oct 06 14:48:11 crc kubenswrapper[4867]: I1006 14:48:11.069710 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-tg8j4_68750dd5-11c8-4fee-853c-09b68df5aff8/ovn-controller/0.log" Oct 06 14:48:11 crc kubenswrapper[4867]: I1006 14:48:11.270129 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7vftx_4960b423-de56-4b83-a577-f551c82c2702/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:11 crc kubenswrapper[4867]: I1006 14:48:11.278756 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad/openstack-network-exporter/0.log" Oct 06 14:48:11 crc kubenswrapper[4867]: I1006 14:48:11.429041 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c47c2b04-7fb7-4fb9-bbed-6e9b88cbadad/ovn-northd/0.log" Oct 06 14:48:11 crc kubenswrapper[4867]: I1006 14:48:11.485592 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b8d27ae1-8b6d-4a9d-b302-a354673be3be/openstack-network-exporter/0.log" Oct 06 14:48:11 crc kubenswrapper[4867]: I1006 14:48:11.504300 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b8d27ae1-8b6d-4a9d-b302-a354673be3be/ovsdbserver-nb/0.log" Oct 06 14:48:11 crc kubenswrapper[4867]: I1006 
14:48:11.748666 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_18420b8b-345a-41e6-b753-6766143362a3/openstack-network-exporter/0.log" Oct 06 14:48:11 crc kubenswrapper[4867]: I1006 14:48:11.752501 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_18420b8b-345a-41e6-b753-6766143362a3/ovsdbserver-sb/0.log" Oct 06 14:48:12 crc kubenswrapper[4867]: I1006 14:48:12.100759 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-594954fbc6-c2fc2_7ddb2a04-2d3f-4340-a512-8921427ba510/placement-api/0.log" Oct 06 14:48:12 crc kubenswrapper[4867]: I1006 14:48:12.112177 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7d54862f-97ef-4958-8b56-4f6f590fc7da/init-config-reloader/0.log" Oct 06 14:48:12 crc kubenswrapper[4867]: I1006 14:48:12.172045 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-594954fbc6-c2fc2_7ddb2a04-2d3f-4340-a512-8921427ba510/placement-log/0.log" Oct 06 14:48:12 crc kubenswrapper[4867]: I1006 14:48:12.288654 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7d54862f-97ef-4958-8b56-4f6f590fc7da/init-config-reloader/0.log" Oct 06 14:48:12 crc kubenswrapper[4867]: I1006 14:48:12.315231 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7d54862f-97ef-4958-8b56-4f6f590fc7da/config-reloader/0.log" Oct 06 14:48:12 crc kubenswrapper[4867]: I1006 14:48:12.345832 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7d54862f-97ef-4958-8b56-4f6f590fc7da/prometheus/0.log" Oct 06 14:48:12 crc kubenswrapper[4867]: I1006 14:48:12.416366 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_7d54862f-97ef-4958-8b56-4f6f590fc7da/thanos-sidecar/0.log" Oct 06 14:48:12 crc 
kubenswrapper[4867]: I1006 14:48:12.509462 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6669c79a-e288-4d00-8add-bffd6b33b8b9/setup-container/0.log" Oct 06 14:48:12 crc kubenswrapper[4867]: I1006 14:48:12.731583 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6669c79a-e288-4d00-8add-bffd6b33b8b9/setup-container/0.log" Oct 06 14:48:12 crc kubenswrapper[4867]: I1006 14:48:12.774487 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6669c79a-e288-4d00-8add-bffd6b33b8b9/rabbitmq/0.log" Oct 06 14:48:12 crc kubenswrapper[4867]: I1006 14:48:12.831408 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_4beec03b-3d57-4c36-a149-153bb022bd7a/setup-container/0.log" Oct 06 14:48:12 crc kubenswrapper[4867]: I1006 14:48:12.959374 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_4beec03b-3d57-4c36-a149-153bb022bd7a/setup-container/0.log" Oct 06 14:48:13 crc kubenswrapper[4867]: I1006 14:48:13.006530 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_4beec03b-3d57-4c36-a149-153bb022bd7a/rabbitmq/0.log" Oct 06 14:48:13 crc kubenswrapper[4867]: I1006 14:48:13.030301 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d0a4a4a-9d75-4d2b-aeb8-1903093398d0/setup-container/0.log" Oct 06 14:48:13 crc kubenswrapper[4867]: I1006 14:48:13.207068 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d0a4a4a-9d75-4d2b-aeb8-1903093398d0/setup-container/0.log" Oct 06 14:48:13 crc kubenswrapper[4867]: I1006 14:48:13.287645 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5d0a4a4a-9d75-4d2b-aeb8-1903093398d0/rabbitmq/0.log" Oct 06 14:48:13 crc kubenswrapper[4867]: 
I1006 14:48:13.321868 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-jfn6w_5759403e-a3b6-4553-9e27-f471a616644f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:13 crc kubenswrapper[4867]: I1006 14:48:13.474797 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-p9xg6_05b64a8a-2fa5-4281-8e82-c27ff976b24f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:13 crc kubenswrapper[4867]: I1006 14:48:13.538774 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-2jnnw_8f86cabb-0582-4b1c-993f-f9766defe823/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:13 crc kubenswrapper[4867]: I1006 14:48:13.688468 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rbjrx_8dec95c2-2ac5-4886-b1e1-ab333d4f5907/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:13 crc kubenswrapper[4867]: I1006 14:48:13.726176 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-smg6g_4b12f715-2704-4545-a627-39426cb3de93/ssh-known-hosts-edpm-deployment/0.log" Oct 06 14:48:13 crc kubenswrapper[4867]: I1006 14:48:13.917283 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b666bc78f-zvlqd_a06d3199-78ee-4389-bbd2-2bc53c012c84/proxy-server/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.097490 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b666bc78f-zvlqd_a06d3199-78ee-4389-bbd2-2bc53c012c84/proxy-httpd/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.100544 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-p9sjp_46037a5a-6fcb-48c6-854d-1f4e60534120/swift-ring-rebalance/0.log" Oct 06 
14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.226655 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/account-auditor/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.304665 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/account-reaper/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.351225 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/account-replicator/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.370776 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/account-server/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.440091 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/container-auditor/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.550424 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/container-replicator/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.569371 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/container-server/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.574230 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/container-updater/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.701781 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/object-auditor/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.744705 4867 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/object-expirer/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.786994 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/object-server/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.801287 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/object-replicator/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.916526 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/object-updater/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.990126 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/swift-recon-cron/0.log" Oct 06 14:48:14 crc kubenswrapper[4867]: I1006 14:48:14.993622 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_dc7edd17-2d19-4949-8849-9a62cd86e861/rsync/0.log" Oct 06 14:48:15 crc kubenswrapper[4867]: I1006 14:48:15.124720 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-9sv5t_a13c4977-6a03-4678-b394-0b33d74ee2a8/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:15 crc kubenswrapper[4867]: I1006 14:48:15.189265 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_43684055-87e6-4568-8a80-8019600aaeef/tempest-tests-tempest-tests-runner/0.log" Oct 06 14:48:15 crc kubenswrapper[4867]: I1006 14:48:15.372958 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9b73e25d-d1ba-4829-948f-bba412f56404/test-operator-logs-container/0.log" Oct 06 14:48:15 crc 
kubenswrapper[4867]: I1006 14:48:15.404432 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-b6d5k_4c45ecf4-2135-407f-ab03-6c1571cd3f76/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:48:16 crc kubenswrapper[4867]: I1006 14:48:16.361447 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_5389fa15-6fc4-4154-9760-38f0653cb802/watcher-applier/0.log" Oct 06 14:48:16 crc kubenswrapper[4867]: I1006 14:48:16.820156 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_288a6591-36fc-453e-b41f-c0bed1da11b6/watcher-api-log/0.log" Oct 06 14:48:19 crc kubenswrapper[4867]: I1006 14:48:19.223163 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:48:19 crc kubenswrapper[4867]: E1006 14:48:19.223768 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:48:19 crc kubenswrapper[4867]: I1006 14:48:19.481867 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_886a11ab-54f5-45c1-a604-41203d080360/watcher-decision-engine/0.log" Oct 06 14:48:20 crc kubenswrapper[4867]: I1006 14:48:20.617430 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_288a6591-36fc-453e-b41f-c0bed1da11b6/watcher-api/0.log" Oct 06 14:48:30 crc kubenswrapper[4867]: I1006 14:48:30.221904 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 
14:48:30 crc kubenswrapper[4867]: E1006 14:48:30.222908 4867 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-shmxq_openshift-machine-config-operator(9f5dc284-392f-4e65-9f43-cb9ced2e47d3)\"" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" Oct 06 14:48:43 crc kubenswrapper[4867]: I1006 14:48:43.220982 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:48:43 crc kubenswrapper[4867]: I1006 14:48:43.759560 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"82af3ea6fa0b81208e9085b645b6be2009676535f0afc1fdf1ac686fc6759a4d"} Oct 06 14:48:43 crc kubenswrapper[4867]: I1006 14:48:43.761482 4867 generic.go:334] "Generic (PLEG): container finished" podID="cebf5a2c-4133-4007-8813-5d872f3ef665" containerID="e2d1be4b07dad6ac43aff080214467294bc18ee6cda63e9b7af43f4c4e50ab11" exitCode=0 Oct 06 14:48:43 crc kubenswrapper[4867]: I1006 14:48:43.761515 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" event={"ID":"cebf5a2c-4133-4007-8813-5d872f3ef665","Type":"ContainerDied","Data":"e2d1be4b07dad6ac43aff080214467294bc18ee6cda63e9b7af43f4c4e50ab11"} Oct 06 14:48:44 crc kubenswrapper[4867]: I1006 14:48:44.892118 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" Oct 06 14:48:44 crc kubenswrapper[4867]: I1006 14:48:44.933173 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fq2cw/crc-debug-lz7lx"] Oct 06 14:48:44 crc kubenswrapper[4867]: I1006 14:48:44.943021 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fq2cw/crc-debug-lz7lx"] Oct 06 14:48:44 crc kubenswrapper[4867]: I1006 14:48:44.958312 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsrvc\" (UniqueName: \"kubernetes.io/projected/cebf5a2c-4133-4007-8813-5d872f3ef665-kube-api-access-hsrvc\") pod \"cebf5a2c-4133-4007-8813-5d872f3ef665\" (UID: \"cebf5a2c-4133-4007-8813-5d872f3ef665\") " Oct 06 14:48:44 crc kubenswrapper[4867]: I1006 14:48:44.958387 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cebf5a2c-4133-4007-8813-5d872f3ef665-host\") pod \"cebf5a2c-4133-4007-8813-5d872f3ef665\" (UID: \"cebf5a2c-4133-4007-8813-5d872f3ef665\") " Oct 06 14:48:44 crc kubenswrapper[4867]: I1006 14:48:44.959431 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cebf5a2c-4133-4007-8813-5d872f3ef665-host" (OuterVolumeSpecName: "host") pod "cebf5a2c-4133-4007-8813-5d872f3ef665" (UID: "cebf5a2c-4133-4007-8813-5d872f3ef665"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:48:44 crc kubenswrapper[4867]: I1006 14:48:44.966448 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cebf5a2c-4133-4007-8813-5d872f3ef665-kube-api-access-hsrvc" (OuterVolumeSpecName: "kube-api-access-hsrvc") pod "cebf5a2c-4133-4007-8813-5d872f3ef665" (UID: "cebf5a2c-4133-4007-8813-5d872f3ef665"). InnerVolumeSpecName "kube-api-access-hsrvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:48:45 crc kubenswrapper[4867]: I1006 14:48:45.061219 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsrvc\" (UniqueName: \"kubernetes.io/projected/cebf5a2c-4133-4007-8813-5d872f3ef665-kube-api-access-hsrvc\") on node \"crc\" DevicePath \"\"" Oct 06 14:48:45 crc kubenswrapper[4867]: I1006 14:48:45.061278 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cebf5a2c-4133-4007-8813-5d872f3ef665-host\") on node \"crc\" DevicePath \"\"" Oct 06 14:48:45 crc kubenswrapper[4867]: I1006 14:48:45.235177 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cebf5a2c-4133-4007-8813-5d872f3ef665" path="/var/lib/kubelet/pods/cebf5a2c-4133-4007-8813-5d872f3ef665/volumes" Oct 06 14:48:45 crc kubenswrapper[4867]: I1006 14:48:45.798344 4867 scope.go:117] "RemoveContainer" containerID="e2d1be4b07dad6ac43aff080214467294bc18ee6cda63e9b7af43f4c4e50ab11" Oct 06 14:48:45 crc kubenswrapper[4867]: I1006 14:48:45.798418 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-lz7lx" Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.116205 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fq2cw/crc-debug-jr47s"] Oct 06 14:48:46 crc kubenswrapper[4867]: E1006 14:48:46.117114 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cebf5a2c-4133-4007-8813-5d872f3ef665" containerName="container-00" Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.117139 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="cebf5a2c-4133-4007-8813-5d872f3ef665" containerName="container-00" Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.117625 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="cebf5a2c-4133-4007-8813-5d872f3ef665" containerName="container-00" Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.118969 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-jr47s" Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.188148 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxc4d\" (UniqueName: \"kubernetes.io/projected/99e1d543-e821-4552-9c8f-62da7f963a7e-kube-api-access-fxc4d\") pod \"crc-debug-jr47s\" (UID: \"99e1d543-e821-4552-9c8f-62da7f963a7e\") " pod="openshift-must-gather-fq2cw/crc-debug-jr47s" Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.188263 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e1d543-e821-4552-9c8f-62da7f963a7e-host\") pod \"crc-debug-jr47s\" (UID: \"99e1d543-e821-4552-9c8f-62da7f963a7e\") " pod="openshift-must-gather-fq2cw/crc-debug-jr47s" Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.291080 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxc4d\" (UniqueName: 
\"kubernetes.io/projected/99e1d543-e821-4552-9c8f-62da7f963a7e-kube-api-access-fxc4d\") pod \"crc-debug-jr47s\" (UID: \"99e1d543-e821-4552-9c8f-62da7f963a7e\") " pod="openshift-must-gather-fq2cw/crc-debug-jr47s" Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.291219 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e1d543-e821-4552-9c8f-62da7f963a7e-host\") pod \"crc-debug-jr47s\" (UID: \"99e1d543-e821-4552-9c8f-62da7f963a7e\") " pod="openshift-must-gather-fq2cw/crc-debug-jr47s" Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.291428 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e1d543-e821-4552-9c8f-62da7f963a7e-host\") pod \"crc-debug-jr47s\" (UID: \"99e1d543-e821-4552-9c8f-62da7f963a7e\") " pod="openshift-must-gather-fq2cw/crc-debug-jr47s" Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.314461 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxc4d\" (UniqueName: \"kubernetes.io/projected/99e1d543-e821-4552-9c8f-62da7f963a7e-kube-api-access-fxc4d\") pod \"crc-debug-jr47s\" (UID: \"99e1d543-e821-4552-9c8f-62da7f963a7e\") " pod="openshift-must-gather-fq2cw/crc-debug-jr47s" Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.441772 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-jr47s" Oct 06 14:48:46 crc kubenswrapper[4867]: W1006 14:48:46.481507 4867 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99e1d543_e821_4552_9c8f_62da7f963a7e.slice/crio-e485c6217f0256097b04180dfd441ca2226efa93d15ef86a5c5a781b78016963 WatchSource:0}: Error finding container e485c6217f0256097b04180dfd441ca2226efa93d15ef86a5c5a781b78016963: Status 404 returned error can't find the container with id e485c6217f0256097b04180dfd441ca2226efa93d15ef86a5c5a781b78016963 Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.813883 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/crc-debug-jr47s" event={"ID":"99e1d543-e821-4552-9c8f-62da7f963a7e","Type":"ContainerStarted","Data":"d3469aeed11db27b3f0e507f7508479181b12b53625439001437baecbbdc6936"} Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.813951 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/crc-debug-jr47s" event={"ID":"99e1d543-e821-4552-9c8f-62da7f963a7e","Type":"ContainerStarted","Data":"e485c6217f0256097b04180dfd441ca2226efa93d15ef86a5c5a781b78016963"} Oct 06 14:48:46 crc kubenswrapper[4867]: I1006 14:48:46.837352 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fq2cw/crc-debug-jr47s" podStartSLOduration=0.837194079 podStartE2EDuration="837.194079ms" podCreationTimestamp="2025-10-06 14:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:48:46.830353456 +0000 UTC m=+6306.288301600" watchObservedRunningTime="2025-10-06 14:48:46.837194079 +0000 UTC m=+6306.295142243" Oct 06 14:48:47 crc kubenswrapper[4867]: I1006 14:48:47.825323 4867 generic.go:334] "Generic (PLEG): container finished" podID="99e1d543-e821-4552-9c8f-62da7f963a7e" 
containerID="d3469aeed11db27b3f0e507f7508479181b12b53625439001437baecbbdc6936" exitCode=0 Oct 06 14:48:47 crc kubenswrapper[4867]: I1006 14:48:47.825389 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/crc-debug-jr47s" event={"ID":"99e1d543-e821-4552-9c8f-62da7f963a7e","Type":"ContainerDied","Data":"d3469aeed11db27b3f0e507f7508479181b12b53625439001437baecbbdc6936"} Oct 06 14:48:48 crc kubenswrapper[4867]: I1006 14:48:48.988448 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-jr47s" Oct 06 14:48:49 crc kubenswrapper[4867]: I1006 14:48:49.054395 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxc4d\" (UniqueName: \"kubernetes.io/projected/99e1d543-e821-4552-9c8f-62da7f963a7e-kube-api-access-fxc4d\") pod \"99e1d543-e821-4552-9c8f-62da7f963a7e\" (UID: \"99e1d543-e821-4552-9c8f-62da7f963a7e\") " Oct 06 14:48:49 crc kubenswrapper[4867]: I1006 14:48:49.054501 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e1d543-e821-4552-9c8f-62da7f963a7e-host\") pod \"99e1d543-e821-4552-9c8f-62da7f963a7e\" (UID: \"99e1d543-e821-4552-9c8f-62da7f963a7e\") " Oct 06 14:48:49 crc kubenswrapper[4867]: I1006 14:48:49.054548 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99e1d543-e821-4552-9c8f-62da7f963a7e-host" (OuterVolumeSpecName: "host") pod "99e1d543-e821-4552-9c8f-62da7f963a7e" (UID: "99e1d543-e821-4552-9c8f-62da7f963a7e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:48:49 crc kubenswrapper[4867]: I1006 14:48:49.054985 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99e1d543-e821-4552-9c8f-62da7f963a7e-host\") on node \"crc\" DevicePath \"\"" Oct 06 14:48:49 crc kubenswrapper[4867]: I1006 14:48:49.061945 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e1d543-e821-4552-9c8f-62da7f963a7e-kube-api-access-fxc4d" (OuterVolumeSpecName: "kube-api-access-fxc4d") pod "99e1d543-e821-4552-9c8f-62da7f963a7e" (UID: "99e1d543-e821-4552-9c8f-62da7f963a7e"). InnerVolumeSpecName "kube-api-access-fxc4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:48:49 crc kubenswrapper[4867]: I1006 14:48:49.157246 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxc4d\" (UniqueName: \"kubernetes.io/projected/99e1d543-e821-4552-9c8f-62da7f963a7e-kube-api-access-fxc4d\") on node \"crc\" DevicePath \"\"" Oct 06 14:48:49 crc kubenswrapper[4867]: I1006 14:48:49.850179 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/crc-debug-jr47s" event={"ID":"99e1d543-e821-4552-9c8f-62da7f963a7e","Type":"ContainerDied","Data":"e485c6217f0256097b04180dfd441ca2226efa93d15ef86a5c5a781b78016963"} Oct 06 14:48:49 crc kubenswrapper[4867]: I1006 14:48:49.850227 4867 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e485c6217f0256097b04180dfd441ca2226efa93d15ef86a5c5a781b78016963" Oct 06 14:48:49 crc kubenswrapper[4867]: I1006 14:48:49.851391 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-jr47s" Oct 06 14:48:57 crc kubenswrapper[4867]: I1006 14:48:57.418848 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fq2cw/crc-debug-jr47s"] Oct 06 14:48:57 crc kubenswrapper[4867]: I1006 14:48:57.428877 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fq2cw/crc-debug-jr47s"] Oct 06 14:48:58 crc kubenswrapper[4867]: I1006 14:48:58.576164 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fq2cw/crc-debug-2mhg5"] Oct 06 14:48:58 crc kubenswrapper[4867]: E1006 14:48:58.577755 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e1d543-e821-4552-9c8f-62da7f963a7e" containerName="container-00" Oct 06 14:48:58 crc kubenswrapper[4867]: I1006 14:48:58.577870 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e1d543-e821-4552-9c8f-62da7f963a7e" containerName="container-00" Oct 06 14:48:58 crc kubenswrapper[4867]: I1006 14:48:58.578280 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e1d543-e821-4552-9c8f-62da7f963a7e" containerName="container-00" Oct 06 14:48:58 crc kubenswrapper[4867]: I1006 14:48:58.579614 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" Oct 06 14:48:58 crc kubenswrapper[4867]: I1006 14:48:58.659684 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cb6p\" (UniqueName: \"kubernetes.io/projected/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-kube-api-access-5cb6p\") pod \"crc-debug-2mhg5\" (UID: \"1f6ecc2b-e79f-47a7-8339-b3ef485810e9\") " pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" Oct 06 14:48:58 crc kubenswrapper[4867]: I1006 14:48:58.659761 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-host\") pod \"crc-debug-2mhg5\" (UID: \"1f6ecc2b-e79f-47a7-8339-b3ef485810e9\") " pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" Oct 06 14:48:58 crc kubenswrapper[4867]: I1006 14:48:58.761785 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cb6p\" (UniqueName: \"kubernetes.io/projected/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-kube-api-access-5cb6p\") pod \"crc-debug-2mhg5\" (UID: \"1f6ecc2b-e79f-47a7-8339-b3ef485810e9\") " pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" Oct 06 14:48:58 crc kubenswrapper[4867]: I1006 14:48:58.762271 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-host\") pod \"crc-debug-2mhg5\" (UID: \"1f6ecc2b-e79f-47a7-8339-b3ef485810e9\") " pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" Oct 06 14:48:58 crc kubenswrapper[4867]: I1006 14:48:58.762382 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-host\") pod \"crc-debug-2mhg5\" (UID: \"1f6ecc2b-e79f-47a7-8339-b3ef485810e9\") " pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" Oct 06 14:48:58 crc 
kubenswrapper[4867]: I1006 14:48:58.780109 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cb6p\" (UniqueName: \"kubernetes.io/projected/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-kube-api-access-5cb6p\") pod \"crc-debug-2mhg5\" (UID: \"1f6ecc2b-e79f-47a7-8339-b3ef485810e9\") " pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" Oct 06 14:48:58 crc kubenswrapper[4867]: I1006 14:48:58.905636 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" Oct 06 14:48:58 crc kubenswrapper[4867]: I1006 14:48:58.964930 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" event={"ID":"1f6ecc2b-e79f-47a7-8339-b3ef485810e9","Type":"ContainerStarted","Data":"843279138ea985d0f52063228833c7c193d2256359b96505c6e187fc406943fa"} Oct 06 14:48:59 crc kubenswrapper[4867]: I1006 14:48:59.234381 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e1d543-e821-4552-9c8f-62da7f963a7e" path="/var/lib/kubelet/pods/99e1d543-e821-4552-9c8f-62da7f963a7e/volumes" Oct 06 14:48:59 crc kubenswrapper[4867]: I1006 14:48:59.976798 4867 generic.go:334] "Generic (PLEG): container finished" podID="1f6ecc2b-e79f-47a7-8339-b3ef485810e9" containerID="278a2f112dda982dfc2f88f8e6377f4e6624ef33834cc9be51609c4911125d9f" exitCode=0 Oct 06 14:48:59 crc kubenswrapper[4867]: I1006 14:48:59.976852 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" event={"ID":"1f6ecc2b-e79f-47a7-8339-b3ef485810e9","Type":"ContainerDied","Data":"278a2f112dda982dfc2f88f8e6377f4e6624ef33834cc9be51609c4911125d9f"} Oct 06 14:49:00 crc kubenswrapper[4867]: I1006 14:49:00.022523 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fq2cw/crc-debug-2mhg5"] Oct 06 14:49:00 crc kubenswrapper[4867]: I1006 14:49:00.034832 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-fq2cw/crc-debug-2mhg5"] Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.105189 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.209307 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-host\") pod \"1f6ecc2b-e79f-47a7-8339-b3ef485810e9\" (UID: \"1f6ecc2b-e79f-47a7-8339-b3ef485810e9\") " Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.209458 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-host" (OuterVolumeSpecName: "host") pod "1f6ecc2b-e79f-47a7-8339-b3ef485810e9" (UID: "1f6ecc2b-e79f-47a7-8339-b3ef485810e9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.209829 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cb6p\" (UniqueName: \"kubernetes.io/projected/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-kube-api-access-5cb6p\") pod \"1f6ecc2b-e79f-47a7-8339-b3ef485810e9\" (UID: \"1f6ecc2b-e79f-47a7-8339-b3ef485810e9\") " Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.210393 4867 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-host\") on node \"crc\" DevicePath \"\"" Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.226955 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-kube-api-access-5cb6p" (OuterVolumeSpecName: "kube-api-access-5cb6p") pod "1f6ecc2b-e79f-47a7-8339-b3ef485810e9" (UID: "1f6ecc2b-e79f-47a7-8339-b3ef485810e9"). 
InnerVolumeSpecName "kube-api-access-5cb6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.244149 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f6ecc2b-e79f-47a7-8339-b3ef485810e9" path="/var/lib/kubelet/pods/1f6ecc2b-e79f-47a7-8339-b3ef485810e9/volumes" Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.312588 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cb6p\" (UniqueName: \"kubernetes.io/projected/1f6ecc2b-e79f-47a7-8339-b3ef485810e9-kube-api-access-5cb6p\") on node \"crc\" DevicePath \"\"" Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.817438 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/util/0.log" Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.921911 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/pull/0.log" Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.975749 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/util/0.log" Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.996710 4867 scope.go:117] "RemoveContainer" containerID="278a2f112dda982dfc2f88f8e6377f4e6624ef33834cc9be51609c4911125d9f" Oct 06 14:49:01 crc kubenswrapper[4867]: I1006 14:49:01.996752 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq2cw/crc-debug-2mhg5" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.013071 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/pull/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.154578 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/extract/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.156012 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/pull/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.167329 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2eb12252d80c5dfff991a9209c4a0732a9b1a61e5e516e96f79828ee3ex6n8b_b09c33a7-5fbd-4594-bebf-bcb9a5519e89/util/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.309531 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-4tgfn_c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536/kube-rbac-proxy/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.392941 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-4tgfn_c0e4bb9c-5d2f-4a0f-9cb8-9133336c2536/manager/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.426498 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-v72nf_3c3a38a7-d3a0-4c01-aae9-645d5dada80f/kube-rbac-proxy/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.549483 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-v72nf_3c3a38a7-d3a0-4c01-aae9-645d5dada80f/manager/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.577496 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-s4qrw_3b46e0ea-7a30-45ab-99cc-d36efd3fc75e/kube-rbac-proxy/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.586164 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-s4qrw_3b46e0ea-7a30-45ab-99cc-d36efd3fc75e/manager/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.748405 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-84drf_7050df56-39f0-4962-878b-7e9c498d86d4/kube-rbac-proxy/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.835496 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-84drf_7050df56-39f0-4962-878b-7e9c498d86d4/manager/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.914723 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-jxt5n_5d14ff34-79c1-467d-99b0-35202d1650bb/kube-rbac-proxy/0.log" Oct 06 14:49:02 crc kubenswrapper[4867]: I1006 14:49:02.942447 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-jxt5n_5d14ff34-79c1-467d-99b0-35202d1650bb/manager/0.log" Oct 06 14:49:03 crc kubenswrapper[4867]: I1006 14:49:03.065236 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-v222m_901a13c6-49ea-4126-8b2d-7c7901720f05/kube-rbac-proxy/0.log" Oct 06 14:49:03 crc kubenswrapper[4867]: I1006 14:49:03.111230 
4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-v222m_901a13c6-49ea-4126-8b2d-7c7901720f05/manager/0.log" Oct 06 14:49:03 crc kubenswrapper[4867]: I1006 14:49:03.194509 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-n64zf_311ba4cb-158b-41f4-ada4-4fed1c0f2ede/kube-rbac-proxy/0.log" Oct 06 14:49:03 crc kubenswrapper[4867]: I1006 14:49:03.358782 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-qhnpp_92cf840d-e92d-4212-8d63-2d623040ca46/kube-rbac-proxy/0.log" Oct 06 14:49:03 crc kubenswrapper[4867]: I1006 14:49:03.420955 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-qhnpp_92cf840d-e92d-4212-8d63-2d623040ca46/manager/0.log" Oct 06 14:49:03 crc kubenswrapper[4867]: I1006 14:49:03.479560 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-n64zf_311ba4cb-158b-41f4-ada4-4fed1c0f2ede/manager/0.log" Oct 06 14:49:03 crc kubenswrapper[4867]: I1006 14:49:03.557078 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-fwbwb_21147e7d-1dd6-4a90-ab7a-f923f014a281/kube-rbac-proxy/0.log" Oct 06 14:49:03 crc kubenswrapper[4867]: I1006 14:49:03.663080 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-fwbwb_21147e7d-1dd6-4a90-ab7a-f923f014a281/manager/0.log" Oct 06 14:49:03 crc kubenswrapper[4867]: I1006 14:49:03.754532 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-njdr6_831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7/manager/0.log" Oct 06 14:49:03 crc 
kubenswrapper[4867]: I1006 14:49:03.769391 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-njdr6_831b6b11-3e22-4ae1-aa26-1ccb9a6bacb7/kube-rbac-proxy/0.log" Oct 06 14:49:03 crc kubenswrapper[4867]: I1006 14:49:03.910976 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg_6c53454b-e984-4366-8bd1-3c4eb10fb1c8/kube-rbac-proxy/0.log" Oct 06 14:49:03 crc kubenswrapper[4867]: I1006 14:49:03.959927 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-xdfzg_6c53454b-e984-4366-8bd1-3c4eb10fb1c8/manager/0.log" Oct 06 14:49:04 crc kubenswrapper[4867]: I1006 14:49:04.060014 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-7jwqc_3d2faf90-2410-459e-a8a3-668296923f2e/kube-rbac-proxy/0.log" Oct 06 14:49:04 crc kubenswrapper[4867]: I1006 14:49:04.135593 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-7jwqc_3d2faf90-2410-459e-a8a3-668296923f2e/manager/0.log" Oct 06 14:49:04 crc kubenswrapper[4867]: I1006 14:49:04.182776 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-kpx9k_15792c9d-8f60-4b13-8623-55c9a6a7319b/kube-rbac-proxy/0.log" Oct 06 14:49:04 crc kubenswrapper[4867]: I1006 14:49:04.336734 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-kpx9k_15792c9d-8f60-4b13-8623-55c9a6a7319b/manager/0.log" Oct 06 14:49:04 crc kubenswrapper[4867]: I1006 14:49:04.343307 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-v6b9l_95e501d6-fddf-4baa-befd-25c5c5f3303e/kube-rbac-proxy/0.log" Oct 06 14:49:04 crc kubenswrapper[4867]: I1006 14:49:04.407492 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-v6b9l_95e501d6-fddf-4baa-befd-25c5c5f3303e/manager/0.log" Oct 06 14:49:04 crc kubenswrapper[4867]: I1006 14:49:04.542975 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt_dbe49bb4-18db-473e-b57c-2047bbbe2405/kube-rbac-proxy/0.log" Oct 06 14:49:04 crc kubenswrapper[4867]: I1006 14:49:04.562913 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665cvlpxt_dbe49bb4-18db-473e-b57c-2047bbbe2405/manager/0.log" Oct 06 14:49:04 crc kubenswrapper[4867]: I1006 14:49:04.775152 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-66dbf6f685-4srz5_94790623-543f-45ee-9579-6e837ce82cd8/kube-rbac-proxy/0.log" Oct 06 14:49:04 crc kubenswrapper[4867]: I1006 14:49:04.850473 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6c5b974dc6-zhns8_2ed7554d-3165-42b7-b7ac-6ad1b620e825/kube-rbac-proxy/0.log" Oct 06 14:49:05 crc kubenswrapper[4867]: I1006 14:49:05.062910 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7ptcs_160e7b7c-4f2e-4dba-99a5-35c4d3d9868d/registry-server/0.log" Oct 06 14:49:05 crc kubenswrapper[4867]: I1006 14:49:05.128397 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6c5b974dc6-zhns8_2ed7554d-3165-42b7-b7ac-6ad1b620e825/operator/0.log" Oct 06 14:49:05 crc kubenswrapper[4867]: 
I1006 14:49:05.326390 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-4lsh6_efcff7d5-4481-45ea-b693-ebc63e9f1458/kube-rbac-proxy/0.log" Oct 06 14:49:05 crc kubenswrapper[4867]: I1006 14:49:05.406955 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-4lsh6_efcff7d5-4481-45ea-b693-ebc63e9f1458/manager/0.log" Oct 06 14:49:05 crc kubenswrapper[4867]: I1006 14:49:05.419455 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-922cb_05580142-d01c-470a-afcb-da956c1f6d36/kube-rbac-proxy/0.log" Oct 06 14:49:05 crc kubenswrapper[4867]: I1006 14:49:05.569759 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-922cb_05580142-d01c-470a-afcb-da956c1f6d36/manager/0.log" Oct 06 14:49:05 crc kubenswrapper[4867]: I1006 14:49:05.609617 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-4x2bp_bdb84464-dbf1-4dfc-9a87-b3dde7d0fcc3/operator/0.log" Oct 06 14:49:05 crc kubenswrapper[4867]: I1006 14:49:05.806711 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-48lgc_937696b7-f234-4e2e-97b3-9ef0f2bf0a90/kube-rbac-proxy/0.log" Oct 06 14:49:05 crc kubenswrapper[4867]: I1006 14:49:05.886596 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-wrrdj_d7b781c6-8500-43b4-884d-e67aadad8518/kube-rbac-proxy/0.log" Oct 06 14:49:05 crc kubenswrapper[4867]: I1006 14:49:05.934942 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-48lgc_937696b7-f234-4e2e-97b3-9ef0f2bf0a90/manager/0.log" 
Oct 06 14:49:06 crc kubenswrapper[4867]: I1006 14:49:06.070407 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-66dbf6f685-4srz5_94790623-543f-45ee-9579-6e837ce82cd8/manager/0.log" Oct 06 14:49:06 crc kubenswrapper[4867]: I1006 14:49:06.075340 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-tvpx8_9d369f1b-62ae-4b24-8287-fd62b21122ce/kube-rbac-proxy/0.log" Oct 06 14:49:06 crc kubenswrapper[4867]: I1006 14:49:06.143981 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-tvpx8_9d369f1b-62ae-4b24-8287-fd62b21122ce/manager/0.log" Oct 06 14:49:06 crc kubenswrapper[4867]: I1006 14:49:06.293847 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55dcdc7cc-z7lp5_b099322d-539c-4c48-9344-62e1fec437ab/kube-rbac-proxy/0.log" Oct 06 14:49:06 crc kubenswrapper[4867]: I1006 14:49:06.373285 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-wrrdj_d7b781c6-8500-43b4-884d-e67aadad8518/manager/0.log" Oct 06 14:49:06 crc kubenswrapper[4867]: I1006 14:49:06.461388 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55dcdc7cc-z7lp5_b099322d-539c-4c48-9344-62e1fec437ab/manager/0.log" Oct 06 14:49:20 crc kubenswrapper[4867]: I1006 14:49:20.298130 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dclxp_1204892b-a86d-4b14-9aca-1fcbd64c9cd2/control-plane-machine-set-operator/0.log" Oct 06 14:49:20 crc kubenswrapper[4867]: I1006 14:49:20.472224 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8lg7n_ed7648e1-d992-4263-9117-e50cd88a66a9/kube-rbac-proxy/0.log" Oct 06 14:49:20 crc kubenswrapper[4867]: I1006 14:49:20.487570 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8lg7n_ed7648e1-d992-4263-9117-e50cd88a66a9/machine-api-operator/0.log" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.637925 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-98j7c"] Oct 06 14:49:21 crc kubenswrapper[4867]: E1006 14:49:21.638794 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f6ecc2b-e79f-47a7-8339-b3ef485810e9" containerName="container-00" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.638807 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f6ecc2b-e79f-47a7-8339-b3ef485810e9" containerName="container-00" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.639042 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f6ecc2b-e79f-47a7-8339-b3ef485810e9" containerName="container-00" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.640584 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.668558 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98j7c"] Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.728951 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-utilities\") pod \"community-operators-98j7c\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.729012 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-catalog-content\") pod \"community-operators-98j7c\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.729164 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tdzq\" (UniqueName: \"kubernetes.io/projected/3580e79c-5f3f-406a-a4bb-27c9bb728779-kube-api-access-8tdzq\") pod \"community-operators-98j7c\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.830599 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tdzq\" (UniqueName: \"kubernetes.io/projected/3580e79c-5f3f-406a-a4bb-27c9bb728779-kube-api-access-8tdzq\") pod \"community-operators-98j7c\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.830747 4867 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-utilities\") pod \"community-operators-98j7c\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.830775 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-catalog-content\") pod \"community-operators-98j7c\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.831278 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-catalog-content\") pod \"community-operators-98j7c\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.831370 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-utilities\") pod \"community-operators-98j7c\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.850214 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tdzq\" (UniqueName: \"kubernetes.io/projected/3580e79c-5f3f-406a-a4bb-27c9bb728779-kube-api-access-8tdzq\") pod \"community-operators-98j7c\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:21 crc kubenswrapper[4867]: I1006 14:49:21.978088 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:22 crc kubenswrapper[4867]: I1006 14:49:22.514715 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98j7c"] Oct 06 14:49:23 crc kubenswrapper[4867]: I1006 14:49:23.215725 4867 generic.go:334] "Generic (PLEG): container finished" podID="3580e79c-5f3f-406a-a4bb-27c9bb728779" containerID="277361776a4d6574bc99398c754e02432bb6b54cff9fa065035eeadaa8f16a86" exitCode=0 Oct 06 14:49:23 crc kubenswrapper[4867]: I1006 14:49:23.215763 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98j7c" event={"ID":"3580e79c-5f3f-406a-a4bb-27c9bb728779","Type":"ContainerDied","Data":"277361776a4d6574bc99398c754e02432bb6b54cff9fa065035eeadaa8f16a86"} Oct 06 14:49:23 crc kubenswrapper[4867]: I1006 14:49:23.216106 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98j7c" event={"ID":"3580e79c-5f3f-406a-a4bb-27c9bb728779","Type":"ContainerStarted","Data":"b3b7944d62eda5fc47a494f7e077ef6b25377451d8ef6e289b085d29f503bfcf"} Oct 06 14:49:24 crc kubenswrapper[4867]: I1006 14:49:24.229068 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98j7c" event={"ID":"3580e79c-5f3f-406a-a4bb-27c9bb728779","Type":"ContainerStarted","Data":"83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9"} Oct 06 14:49:25 crc kubenswrapper[4867]: I1006 14:49:25.239830 4867 generic.go:334] "Generic (PLEG): container finished" podID="3580e79c-5f3f-406a-a4bb-27c9bb728779" containerID="83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9" exitCode=0 Oct 06 14:49:25 crc kubenswrapper[4867]: I1006 14:49:25.239910 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98j7c" 
event={"ID":"3580e79c-5f3f-406a-a4bb-27c9bb728779","Type":"ContainerDied","Data":"83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9"} Oct 06 14:49:26 crc kubenswrapper[4867]: I1006 14:49:26.257483 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98j7c" event={"ID":"3580e79c-5f3f-406a-a4bb-27c9bb728779","Type":"ContainerStarted","Data":"95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12"} Oct 06 14:49:26 crc kubenswrapper[4867]: I1006 14:49:26.287552 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-98j7c" podStartSLOduration=2.831363943 podStartE2EDuration="5.287535189s" podCreationTimestamp="2025-10-06 14:49:21 +0000 UTC" firstStartedPulling="2025-10-06 14:49:23.218696819 +0000 UTC m=+6342.676644963" lastFinishedPulling="2025-10-06 14:49:25.674868065 +0000 UTC m=+6345.132816209" observedRunningTime="2025-10-06 14:49:26.281334343 +0000 UTC m=+6345.739282487" watchObservedRunningTime="2025-10-06 14:49:26.287535189 +0000 UTC m=+6345.745483333" Oct 06 14:49:31 crc kubenswrapper[4867]: I1006 14:49:31.954876 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-tlbns_ed16aef2-69b2-443d-8d5d-c2122dd5b373/cert-manager-controller/0.log" Oct 06 14:49:31 crc kubenswrapper[4867]: I1006 14:49:31.978667 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:31 crc kubenswrapper[4867]: I1006 14:49:31.978727 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:32 crc kubenswrapper[4867]: I1006 14:49:32.033929 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:32 crc kubenswrapper[4867]: I1006 14:49:32.195734 4867 log.go:25] 
"Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-rxv42_e5b18647-65b8-4ed4-bf88-542c6c583588/cert-manager-webhook/0.log" Oct 06 14:49:32 crc kubenswrapper[4867]: I1006 14:49:32.209074 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-h95bg_29b819dc-d3f7-449d-812a-9a76c1d02046/cert-manager-cainjector/0.log" Oct 06 14:49:32 crc kubenswrapper[4867]: I1006 14:49:32.360951 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:32 crc kubenswrapper[4867]: I1006 14:49:32.407305 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-98j7c"] Oct 06 14:49:34 crc kubenswrapper[4867]: I1006 14:49:34.333050 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-98j7c" podUID="3580e79c-5f3f-406a-a4bb-27c9bb728779" containerName="registry-server" containerID="cri-o://95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12" gracePeriod=2 Oct 06 14:49:34 crc kubenswrapper[4867]: I1006 14:49:34.835286 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:34 crc kubenswrapper[4867]: I1006 14:49:34.991042 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-catalog-content\") pod \"3580e79c-5f3f-406a-a4bb-27c9bb728779\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " Oct 06 14:49:34 crc kubenswrapper[4867]: I1006 14:49:34.991163 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdzq\" (UniqueName: \"kubernetes.io/projected/3580e79c-5f3f-406a-a4bb-27c9bb728779-kube-api-access-8tdzq\") pod \"3580e79c-5f3f-406a-a4bb-27c9bb728779\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " Oct 06 14:49:34 crc kubenswrapper[4867]: I1006 14:49:34.991194 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-utilities\") pod \"3580e79c-5f3f-406a-a4bb-27c9bb728779\" (UID: \"3580e79c-5f3f-406a-a4bb-27c9bb728779\") " Oct 06 14:49:34 crc kubenswrapper[4867]: I1006 14:49:34.992374 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-utilities" (OuterVolumeSpecName: "utilities") pod "3580e79c-5f3f-406a-a4bb-27c9bb728779" (UID: "3580e79c-5f3f-406a-a4bb-27c9bb728779"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.001602 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3580e79c-5f3f-406a-a4bb-27c9bb728779-kube-api-access-8tdzq" (OuterVolumeSpecName: "kube-api-access-8tdzq") pod "3580e79c-5f3f-406a-a4bb-27c9bb728779" (UID: "3580e79c-5f3f-406a-a4bb-27c9bb728779"). InnerVolumeSpecName "kube-api-access-8tdzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.042398 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3580e79c-5f3f-406a-a4bb-27c9bb728779" (UID: "3580e79c-5f3f-406a-a4bb-27c9bb728779"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.093958 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.093994 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdzq\" (UniqueName: \"kubernetes.io/projected/3580e79c-5f3f-406a-a4bb-27c9bb728779-kube-api-access-8tdzq\") on node \"crc\" DevicePath \"\"" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.094004 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3580e79c-5f3f-406a-a4bb-27c9bb728779-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.345010 4867 generic.go:334] "Generic (PLEG): container finished" podID="3580e79c-5f3f-406a-a4bb-27c9bb728779" containerID="95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12" exitCode=0 Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.345065 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98j7c" event={"ID":"3580e79c-5f3f-406a-a4bb-27c9bb728779","Type":"ContainerDied","Data":"95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12"} Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.345086 4867 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-98j7c" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.345108 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98j7c" event={"ID":"3580e79c-5f3f-406a-a4bb-27c9bb728779","Type":"ContainerDied","Data":"b3b7944d62eda5fc47a494f7e077ef6b25377451d8ef6e289b085d29f503bfcf"} Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.345143 4867 scope.go:117] "RemoveContainer" containerID="95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.369828 4867 scope.go:117] "RemoveContainer" containerID="83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.369977 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-98j7c"] Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.379476 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-98j7c"] Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.394290 4867 scope.go:117] "RemoveContainer" containerID="277361776a4d6574bc99398c754e02432bb6b54cff9fa065035eeadaa8f16a86" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.456829 4867 scope.go:117] "RemoveContainer" containerID="95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12" Oct 06 14:49:35 crc kubenswrapper[4867]: E1006 14:49:35.457478 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12\": container with ID starting with 95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12 not found: ID does not exist" containerID="95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.457506 
4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12"} err="failed to get container status \"95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12\": rpc error: code = NotFound desc = could not find container \"95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12\": container with ID starting with 95001fb08dd6acaac9232283b68cbd652e01702575e2c40dd461d398ad295f12 not found: ID does not exist" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.457544 4867 scope.go:117] "RemoveContainer" containerID="83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9" Oct 06 14:49:35 crc kubenswrapper[4867]: E1006 14:49:35.458060 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9\": container with ID starting with 83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9 not found: ID does not exist" containerID="83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.458111 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9"} err="failed to get container status \"83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9\": rpc error: code = NotFound desc = could not find container \"83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9\": container with ID starting with 83a9b45705a8823da1d96ee79fcf2cb0ddb88ed0a87296c5ba4b988d81e1baa9 not found: ID does not exist" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.458133 4867 scope.go:117] "RemoveContainer" containerID="277361776a4d6574bc99398c754e02432bb6b54cff9fa065035eeadaa8f16a86" Oct 06 14:49:35 crc kubenswrapper[4867]: E1006 
14:49:35.458620 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"277361776a4d6574bc99398c754e02432bb6b54cff9fa065035eeadaa8f16a86\": container with ID starting with 277361776a4d6574bc99398c754e02432bb6b54cff9fa065035eeadaa8f16a86 not found: ID does not exist" containerID="277361776a4d6574bc99398c754e02432bb6b54cff9fa065035eeadaa8f16a86" Oct 06 14:49:35 crc kubenswrapper[4867]: I1006 14:49:35.458646 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"277361776a4d6574bc99398c754e02432bb6b54cff9fa065035eeadaa8f16a86"} err="failed to get container status \"277361776a4d6574bc99398c754e02432bb6b54cff9fa065035eeadaa8f16a86\": rpc error: code = NotFound desc = could not find container \"277361776a4d6574bc99398c754e02432bb6b54cff9fa065035eeadaa8f16a86\": container with ID starting with 277361776a4d6574bc99398c754e02432bb6b54cff9fa065035eeadaa8f16a86 not found: ID does not exist" Oct 06 14:49:37 crc kubenswrapper[4867]: I1006 14:49:37.234318 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3580e79c-5f3f-406a-a4bb-27c9bb728779" path="/var/lib/kubelet/pods/3580e79c-5f3f-406a-a4bb-27c9bb728779/volumes" Oct 06 14:49:44 crc kubenswrapper[4867]: I1006 14:49:44.002245 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-trjgk_fa0bd3d7-281e-4e5d-adef-b6c38e2d6d83/nmstate-console-plugin/0.log" Oct 06 14:49:44 crc kubenswrapper[4867]: I1006 14:49:44.217068 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wvv72_e19fdddd-1727-4c4b-985f-7548c278b0ca/nmstate-handler/0.log" Oct 06 14:49:44 crc kubenswrapper[4867]: I1006 14:49:44.262743 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vdksl_99a4464a-a11f-4a4e-86ae-43a9a76b060a/kube-rbac-proxy/0.log" Oct 06 14:49:44 crc 
kubenswrapper[4867]: I1006 14:49:44.273735 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-vdksl_99a4464a-a11f-4a4e-86ae-43a9a76b060a/nmstate-metrics/0.log" Oct 06 14:49:44 crc kubenswrapper[4867]: I1006 14:49:44.500159 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-b6qnd_1bed039e-de7f-49b2-b0fe-47e8bc055e8d/nmstate-operator/0.log" Oct 06 14:49:44 crc kubenswrapper[4867]: I1006 14:49:44.529211 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-fm444_5802445f-947f-4d52-b1f3-91f404ef0088/nmstate-webhook/0.log" Oct 06 14:49:58 crc kubenswrapper[4867]: I1006 14:49:58.956590 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-qs5gj_39dc72e9-c1d5-4257-b8ea-248aaed554e5/kube-rbac-proxy/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.160307 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-qs5gj_39dc72e9-c1d5-4257-b8ea-248aaed554e5/controller/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.169520 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-frr-files/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.399825 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-reloader/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.429651 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-frr-files/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.432074 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-metrics/0.log" 
Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.436921 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-reloader/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.629218 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-frr-files/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.639192 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-reloader/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.664518 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-metrics/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.679623 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-metrics/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.876876 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-frr-files/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.878498 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/controller/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.891153 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-metrics/0.log" Oct 06 14:49:59 crc kubenswrapper[4867]: I1006 14:49:59.905154 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/cp-reloader/0.log" Oct 06 14:50:00 crc kubenswrapper[4867]: I1006 14:50:00.055011 4867 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/frr-metrics/0.log" Oct 06 14:50:00 crc kubenswrapper[4867]: I1006 14:50:00.127793 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/kube-rbac-proxy/0.log" Oct 06 14:50:00 crc kubenswrapper[4867]: I1006 14:50:00.138077 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/kube-rbac-proxy-frr/0.log" Oct 06 14:50:00 crc kubenswrapper[4867]: I1006 14:50:00.345820 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/reloader/0.log" Oct 06 14:50:00 crc kubenswrapper[4867]: I1006 14:50:00.370888 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-xdc52_bab9da54-1204-49fd-af69-b48a1542d2e7/frr-k8s-webhook-server/0.log" Oct 06 14:50:00 crc kubenswrapper[4867]: I1006 14:50:00.582921 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5487d99769-x5czz_43844a7c-24fd-49b1-9860-6b4a63fc136a/manager/0.log" Oct 06 14:50:00 crc kubenswrapper[4867]: I1006 14:50:00.838697 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-9d5469fbf-r6fln_fb30a785-833d-47ee-be7f-5235fbfc826c/webhook-server/0.log" Oct 06 14:50:00 crc kubenswrapper[4867]: I1006 14:50:00.870042 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xn6jt_e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb/kube-rbac-proxy/0.log" Oct 06 14:50:01 crc kubenswrapper[4867]: I1006 14:50:01.670436 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xn6jt_e62a5b4f-ce8b-4681-a7fd-2453cd04e1cb/speaker/0.log" Oct 06 14:50:01 crc kubenswrapper[4867]: I1006 14:50:01.958102 
4867 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8wgsb_e88a96ad-49db-4ffa-b274-9160056bb4c9/frr/0.log" Oct 06 14:50:13 crc kubenswrapper[4867]: I1006 14:50:13.963691 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/util/0.log" Oct 06 14:50:14 crc kubenswrapper[4867]: I1006 14:50:14.302568 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/util/0.log" Oct 06 14:50:14 crc kubenswrapper[4867]: I1006 14:50:14.333780 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/pull/0.log" Oct 06 14:50:14 crc kubenswrapper[4867]: I1006 14:50:14.342933 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/pull/0.log" Oct 06 14:50:14 crc kubenswrapper[4867]: I1006 14:50:14.504491 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/util/0.log" Oct 06 14:50:14 crc kubenswrapper[4867]: I1006 14:50:14.524901 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/pull/0.log" Oct 06 14:50:14 crc kubenswrapper[4867]: I1006 14:50:14.598425 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d25576l_4ab89d13-c239-4d47-aa11-68d1ea20e6b1/extract/0.log" Oct 06 14:50:14 crc 
kubenswrapper[4867]: I1006 14:50:14.698307 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/util/0.log" Oct 06 14:50:15 crc kubenswrapper[4867]: I1006 14:50:15.185868 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/util/0.log" Oct 06 14:50:15 crc kubenswrapper[4867]: I1006 14:50:15.193447 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/pull/0.log" Oct 06 14:50:15 crc kubenswrapper[4867]: I1006 14:50:15.205074 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/pull/0.log" Oct 06 14:50:15 crc kubenswrapper[4867]: I1006 14:50:15.466811 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/util/0.log" Oct 06 14:50:15 crc kubenswrapper[4867]: I1006 14:50:15.474383 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/extract/0.log" Oct 06 14:50:15 crc kubenswrapper[4867]: I1006 14:50:15.495877 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2drqhk2_0dd77e1a-c0ba-4ebd-bbf3-118123d0d2be/pull/0.log" Oct 06 14:50:15 crc kubenswrapper[4867]: I1006 14:50:15.664529 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-utilities/0.log" Oct 06 14:50:15 crc kubenswrapper[4867]: I1006 14:50:15.863555 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-utilities/0.log" Oct 06 14:50:15 crc kubenswrapper[4867]: I1006 14:50:15.916473 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-content/0.log" Oct 06 14:50:15 crc kubenswrapper[4867]: I1006 14:50:15.939977 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-content/0.log" Oct 06 14:50:16 crc kubenswrapper[4867]: I1006 14:50:16.098965 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-content/0.log" Oct 06 14:50:16 crc kubenswrapper[4867]: I1006 14:50:16.127678 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/extract-utilities/0.log" Oct 06 14:50:16 crc kubenswrapper[4867]: I1006 14:50:16.367730 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-f85vr_f466559d-88ca-40ca-aa83-7dcf8cf436f4/registry-server/0.log" Oct 06 14:50:16 crc kubenswrapper[4867]: I1006 14:50:16.381121 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-utilities/0.log" Oct 06 14:50:16 crc kubenswrapper[4867]: I1006 14:50:16.540763 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-content/0.log" Oct 06 14:50:16 crc kubenswrapper[4867]: I1006 14:50:16.560738 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-utilities/0.log" Oct 06 14:50:16 crc kubenswrapper[4867]: I1006 14:50:16.560783 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-content/0.log" Oct 06 14:50:16 crc kubenswrapper[4867]: I1006 14:50:16.735403 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-content/0.log" Oct 06 14:50:16 crc kubenswrapper[4867]: I1006 14:50:16.758855 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/extract-utilities/0.log" Oct 06 14:50:16 crc kubenswrapper[4867]: I1006 14:50:16.989463 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/util/0.log" Oct 06 14:50:17 crc kubenswrapper[4867]: I1006 14:50:17.190188 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/pull/0.log" Oct 06 14:50:17 crc kubenswrapper[4867]: I1006 14:50:17.202222 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/pull/0.log" Oct 06 14:50:17 crc kubenswrapper[4867]: I1006 14:50:17.255351 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/util/0.log" Oct 06 14:50:17 crc kubenswrapper[4867]: I1006 14:50:17.594409 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/util/0.log" Oct 06 14:50:17 crc kubenswrapper[4867]: I1006 14:50:17.638590 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/extract/0.log" Oct 06 14:50:17 crc kubenswrapper[4867]: I1006 14:50:17.678841 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c4nfqc_2548a3ec-5354-4309-a045-1a29253ad94b/pull/0.log" Oct 06 14:50:17 crc kubenswrapper[4867]: I1006 14:50:17.891226 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-29nnd_977ff4f6-ca6e-4ba8-8db3-3eb828510a13/registry-server/0.log" Oct 06 14:50:17 crc kubenswrapper[4867]: I1006 14:50:17.948778 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rdmr9_298bf2ee-baaf-4fbb-a107-d712667f246e/marketplace-operator/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.064694 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-utilities/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.222046 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-utilities/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.237471 4867 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-content/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.247938 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-content/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.420544 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-content/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.472310 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-utilities/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.473783 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/extract-utilities/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.646918 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-khr9q_a3d6b459-d981-4cbe-8658-27860d930c81/registry-server/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.674076 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-content/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.743627 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-utilities/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.745472 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-content/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.893198 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-utilities/0.log" Oct 06 14:50:18 crc kubenswrapper[4867]: I1006 14:50:18.895493 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/extract-content/0.log" Oct 06 14:50:19 crc kubenswrapper[4867]: I1006 14:50:19.114830 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bpzds_1afa423a-dc5a-4b79-b4ea-5868f9ea04b3/registry-server/0.log" Oct 06 14:50:29 crc kubenswrapper[4867]: I1006 14:50:29.731496 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-rxd2f_1b226da8-0bf8-4ead-b308-6677288373a3/prometheus-operator/0.log" Oct 06 14:50:29 crc kubenswrapper[4867]: I1006 14:50:29.852608 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c667696bd-98f7n_b47e5b18-abb6-4dc9-bc90-c37e31034183/prometheus-operator-admission-webhook/0.log" Oct 06 14:50:29 crc kubenswrapper[4867]: I1006 14:50:29.926635 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c667696bd-blr54_4c780336-2ad2-49ef-97b4-0161e4dceb44/prometheus-operator-admission-webhook/0.log" Oct 06 14:50:30 crc kubenswrapper[4867]: I1006 14:50:30.102404 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-pkjkr_d4f4e099-818f-4e18-b1d2-dc026962eb51/operator/0.log" Oct 06 14:50:30 crc kubenswrapper[4867]: I1006 14:50:30.109391 4867 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-zm28w_cb9ae008-7e15-4aa1-84fa-93f513646286/perses-operator/0.log" Oct 06 14:51:12 crc kubenswrapper[4867]: I1006 14:51:12.873314 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:51:12 crc kubenswrapper[4867]: I1006 14:51:12.875027 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:51:17 crc kubenswrapper[4867]: I1006 14:51:17.912927 4867 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kkb2p"] Oct 06 14:51:17 crc kubenswrapper[4867]: E1006 14:51:17.914525 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3580e79c-5f3f-406a-a4bb-27c9bb728779" containerName="extract-utilities" Oct 06 14:51:17 crc kubenswrapper[4867]: I1006 14:51:17.914544 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3580e79c-5f3f-406a-a4bb-27c9bb728779" containerName="extract-utilities" Oct 06 14:51:17 crc kubenswrapper[4867]: E1006 14:51:17.914574 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3580e79c-5f3f-406a-a4bb-27c9bb728779" containerName="registry-server" Oct 06 14:51:17 crc kubenswrapper[4867]: I1006 14:51:17.914583 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3580e79c-5f3f-406a-a4bb-27c9bb728779" containerName="registry-server" Oct 06 14:51:17 crc kubenswrapper[4867]: E1006 14:51:17.914598 4867 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3580e79c-5f3f-406a-a4bb-27c9bb728779" containerName="extract-content" Oct 06 14:51:17 crc kubenswrapper[4867]: I1006 14:51:17.914605 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="3580e79c-5f3f-406a-a4bb-27c9bb728779" containerName="extract-content" Oct 06 14:51:17 crc kubenswrapper[4867]: I1006 14:51:17.915182 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="3580e79c-5f3f-406a-a4bb-27c9bb728779" containerName="registry-server" Oct 06 14:51:17 crc kubenswrapper[4867]: I1006 14:51:17.917396 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:17 crc kubenswrapper[4867]: I1006 14:51:17.926445 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kkb2p"] Oct 06 14:51:18 crc kubenswrapper[4867]: I1006 14:51:18.081736 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-catalog-content\") pod \"certified-operators-kkb2p\" (UID: \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") " pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:18 crc kubenswrapper[4867]: I1006 14:51:18.081850 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-utilities\") pod \"certified-operators-kkb2p\" (UID: \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") " pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:18 crc kubenswrapper[4867]: I1006 14:51:18.081945 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d94pt\" (UniqueName: \"kubernetes.io/projected/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-kube-api-access-d94pt\") pod \"certified-operators-kkb2p\" (UID: 
\"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") " pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:18 crc kubenswrapper[4867]: I1006 14:51:18.183541 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d94pt\" (UniqueName: \"kubernetes.io/projected/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-kube-api-access-d94pt\") pod \"certified-operators-kkb2p\" (UID: \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") " pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:18 crc kubenswrapper[4867]: I1006 14:51:18.183660 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-catalog-content\") pod \"certified-operators-kkb2p\" (UID: \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") " pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:18 crc kubenswrapper[4867]: I1006 14:51:18.183718 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-utilities\") pod \"certified-operators-kkb2p\" (UID: \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") " pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:18 crc kubenswrapper[4867]: I1006 14:51:18.184166 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-catalog-content\") pod \"certified-operators-kkb2p\" (UID: \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") " pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:18 crc kubenswrapper[4867]: I1006 14:51:18.184215 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-utilities\") pod \"certified-operators-kkb2p\" (UID: \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") 
" pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:18 crc kubenswrapper[4867]: I1006 14:51:18.209911 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d94pt\" (UniqueName: \"kubernetes.io/projected/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-kube-api-access-d94pt\") pod \"certified-operators-kkb2p\" (UID: \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") " pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:18 crc kubenswrapper[4867]: I1006 14:51:18.246755 4867 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:18 crc kubenswrapper[4867]: I1006 14:51:18.784300 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kkb2p"] Oct 06 14:51:19 crc kubenswrapper[4867]: I1006 14:51:19.378310 4867 generic.go:334] "Generic (PLEG): container finished" podID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" containerID="21a44139a191cf885787e43214c732d39825cd07d49d6c52ac52885ce17ed169" exitCode=0 Oct 06 14:51:19 crc kubenswrapper[4867]: I1006 14:51:19.378428 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kkb2p" event={"ID":"dbf3e2e9-ad6e-4b34-acce-f3654649c69a","Type":"ContainerDied","Data":"21a44139a191cf885787e43214c732d39825cd07d49d6c52ac52885ce17ed169"} Oct 06 14:51:19 crc kubenswrapper[4867]: I1006 14:51:19.378783 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kkb2p" event={"ID":"dbf3e2e9-ad6e-4b34-acce-f3654649c69a","Type":"ContainerStarted","Data":"2dcfc5975a17215b3f8d2fdc0e67c08d92ea52c82225ebb190a7b9c616fb5aa0"} Oct 06 14:51:19 crc kubenswrapper[4867]: I1006 14:51:19.381160 4867 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 14:51:20 crc kubenswrapper[4867]: I1006 14:51:20.389243 4867 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-kkb2p" event={"ID":"dbf3e2e9-ad6e-4b34-acce-f3654649c69a","Type":"ContainerStarted","Data":"ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54"} Oct 06 14:51:21 crc kubenswrapper[4867]: I1006 14:51:21.398572 4867 generic.go:334] "Generic (PLEG): container finished" podID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" containerID="ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54" exitCode=0 Oct 06 14:51:21 crc kubenswrapper[4867]: I1006 14:51:21.398663 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kkb2p" event={"ID":"dbf3e2e9-ad6e-4b34-acce-f3654649c69a","Type":"ContainerDied","Data":"ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54"} Oct 06 14:51:22 crc kubenswrapper[4867]: I1006 14:51:22.409333 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kkb2p" event={"ID":"dbf3e2e9-ad6e-4b34-acce-f3654649c69a","Type":"ContainerStarted","Data":"79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f"} Oct 06 14:51:22 crc kubenswrapper[4867]: I1006 14:51:22.435284 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kkb2p" podStartSLOduration=3.019491346 podStartE2EDuration="5.435248838s" podCreationTimestamp="2025-10-06 14:51:17 +0000 UTC" firstStartedPulling="2025-10-06 14:51:19.380216548 +0000 UTC m=+6458.838164692" lastFinishedPulling="2025-10-06 14:51:21.79597404 +0000 UTC m=+6461.253922184" observedRunningTime="2025-10-06 14:51:22.428069855 +0000 UTC m=+6461.886017999" watchObservedRunningTime="2025-10-06 14:51:22.435248838 +0000 UTC m=+6461.893196982" Oct 06 14:51:28 crc kubenswrapper[4867]: I1006 14:51:28.247932 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:28 crc kubenswrapper[4867]: I1006 14:51:28.248552 
4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:28 crc kubenswrapper[4867]: I1006 14:51:28.296045 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:28 crc kubenswrapper[4867]: I1006 14:51:28.553737 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:28 crc kubenswrapper[4867]: I1006 14:51:28.605357 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kkb2p"] Oct 06 14:51:30 crc kubenswrapper[4867]: I1006 14:51:30.522073 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kkb2p" podUID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" containerName="registry-server" containerID="cri-o://79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f" gracePeriod=2 Oct 06 14:51:30 crc kubenswrapper[4867]: I1006 14:51:30.970005 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.044779 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-utilities\") pod \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\" (UID: \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") " Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.045491 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d94pt\" (UniqueName: \"kubernetes.io/projected/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-kube-api-access-d94pt\") pod \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\" (UID: \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") " Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.045554 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-catalog-content\") pod \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\" (UID: \"dbf3e2e9-ad6e-4b34-acce-f3654649c69a\") " Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.052045 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-kube-api-access-d94pt" (OuterVolumeSpecName: "kube-api-access-d94pt") pod "dbf3e2e9-ad6e-4b34-acce-f3654649c69a" (UID: "dbf3e2e9-ad6e-4b34-acce-f3654649c69a"). InnerVolumeSpecName "kube-api-access-d94pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.055023 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-utilities" (OuterVolumeSpecName: "utilities") pod "dbf3e2e9-ad6e-4b34-acce-f3654649c69a" (UID: "dbf3e2e9-ad6e-4b34-acce-f3654649c69a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.109409 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbf3e2e9-ad6e-4b34-acce-f3654649c69a" (UID: "dbf3e2e9-ad6e-4b34-acce-f3654649c69a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.147596 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d94pt\" (UniqueName: \"kubernetes.io/projected/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-kube-api-access-d94pt\") on node \"crc\" DevicePath \"\"" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.147637 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.147649 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf3e2e9-ad6e-4b34-acce-f3654649c69a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.533696 4867 generic.go:334] "Generic (PLEG): container finished" podID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" containerID="79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f" exitCode=0 Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.533752 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kkb2p" event={"ID":"dbf3e2e9-ad6e-4b34-acce-f3654649c69a","Type":"ContainerDied","Data":"79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f"} Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.533798 4867 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-kkb2p" event={"ID":"dbf3e2e9-ad6e-4b34-acce-f3654649c69a","Type":"ContainerDied","Data":"2dcfc5975a17215b3f8d2fdc0e67c08d92ea52c82225ebb190a7b9c616fb5aa0"} Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.533821 4867 scope.go:117] "RemoveContainer" containerID="79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.534967 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kkb2p" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.568458 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kkb2p"] Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.569971 4867 scope.go:117] "RemoveContainer" containerID="ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.578710 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kkb2p"] Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.600608 4867 scope.go:117] "RemoveContainer" containerID="21a44139a191cf885787e43214c732d39825cd07d49d6c52ac52885ce17ed169" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.646504 4867 scope.go:117] "RemoveContainer" containerID="79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f" Oct 06 14:51:31 crc kubenswrapper[4867]: E1006 14:51:31.647177 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f\": container with ID starting with 79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f not found: ID does not exist" containerID="79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 
14:51:31.647296 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f"} err="failed to get container status \"79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f\": rpc error: code = NotFound desc = could not find container \"79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f\": container with ID starting with 79e0097d294d8ab95eb0c846c2ec1f73c641fd1eab989c3dd22c4f6841c1447f not found: ID does not exist" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.647390 4867 scope.go:117] "RemoveContainer" containerID="ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54" Oct 06 14:51:31 crc kubenswrapper[4867]: E1006 14:51:31.647688 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54\": container with ID starting with ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54 not found: ID does not exist" containerID="ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.647791 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54"} err="failed to get container status \"ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54\": rpc error: code = NotFound desc = could not find container \"ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54\": container with ID starting with ca8c09d79d0b0eadf62da0b8848049b4bd4c18fbf6be51db3afddb2dd91b9a54 not found: ID does not exist" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.647871 4867 scope.go:117] "RemoveContainer" containerID="21a44139a191cf885787e43214c732d39825cd07d49d6c52ac52885ce17ed169" Oct 06 14:51:31 crc 
kubenswrapper[4867]: E1006 14:51:31.648233 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21a44139a191cf885787e43214c732d39825cd07d49d6c52ac52885ce17ed169\": container with ID starting with 21a44139a191cf885787e43214c732d39825cd07d49d6c52ac52885ce17ed169 not found: ID does not exist" containerID="21a44139a191cf885787e43214c732d39825cd07d49d6c52ac52885ce17ed169" Oct 06 14:51:31 crc kubenswrapper[4867]: I1006 14:51:31.648334 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21a44139a191cf885787e43214c732d39825cd07d49d6c52ac52885ce17ed169"} err="failed to get container status \"21a44139a191cf885787e43214c732d39825cd07d49d6c52ac52885ce17ed169\": rpc error: code = NotFound desc = could not find container \"21a44139a191cf885787e43214c732d39825cd07d49d6c52ac52885ce17ed169\": container with ID starting with 21a44139a191cf885787e43214c732d39825cd07d49d6c52ac52885ce17ed169 not found: ID does not exist" Oct 06 14:51:33 crc kubenswrapper[4867]: I1006 14:51:33.247584 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" path="/var/lib/kubelet/pods/dbf3e2e9-ad6e-4b34-acce-f3654649c69a/volumes" Oct 06 14:51:42 crc kubenswrapper[4867]: I1006 14:51:42.873889 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:51:42 crc kubenswrapper[4867]: I1006 14:51:42.874634 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 06 14:52:12 crc kubenswrapper[4867]: I1006 14:52:12.873288 4867 patch_prober.go:28] interesting pod/machine-config-daemon-shmxq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:52:12 crc kubenswrapper[4867]: I1006 14:52:12.874038 4867 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:52:12 crc kubenswrapper[4867]: I1006 14:52:12.874148 4867 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" Oct 06 14:52:12 crc kubenswrapper[4867]: I1006 14:52:12.875058 4867 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82af3ea6fa0b81208e9085b645b6be2009676535f0afc1fdf1ac686fc6759a4d"} pod="openshift-machine-config-operator/machine-config-daemon-shmxq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 14:52:12 crc kubenswrapper[4867]: I1006 14:52:12.875105 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" podUID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" containerName="machine-config-daemon" containerID="cri-o://82af3ea6fa0b81208e9085b645b6be2009676535f0afc1fdf1ac686fc6759a4d" gracePeriod=600 Oct 06 14:52:13 crc kubenswrapper[4867]: I1006 14:52:13.948409 4867 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc284-392f-4e65-9f43-cb9ced2e47d3" 
containerID="82af3ea6fa0b81208e9085b645b6be2009676535f0afc1fdf1ac686fc6759a4d" exitCode=0 Oct 06 14:52:13 crc kubenswrapper[4867]: I1006 14:52:13.948482 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerDied","Data":"82af3ea6fa0b81208e9085b645b6be2009676535f0afc1fdf1ac686fc6759a4d"} Oct 06 14:52:13 crc kubenswrapper[4867]: I1006 14:52:13.950673 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-shmxq" event={"ID":"9f5dc284-392f-4e65-9f43-cb9ced2e47d3","Type":"ContainerStarted","Data":"598b440a6d553336cdb1ee7d602bc6b98e8cf575373306d95da17e53528aa534"} Oct 06 14:52:13 crc kubenswrapper[4867]: I1006 14:52:13.950775 4867 scope.go:117] "RemoveContainer" containerID="75ddd9ef21f287ef932ed1310accc5cffed19c73af89a847a3a80323e37d781b" Oct 06 14:52:50 crc kubenswrapper[4867]: I1006 14:52:50.296529 4867 generic.go:334] "Generic (PLEG): container finished" podID="0c0c2dd3-6303-4a00-be14-8af305c08842" containerID="0dbc7aa1c2d805835e66377ce807601a751aa5336dc091b3a8d6c8bbb98ca668" exitCode=0 Oct 06 14:52:50 crc kubenswrapper[4867]: I1006 14:52:50.296640 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fq2cw/must-gather-7xspx" event={"ID":"0c0c2dd3-6303-4a00-be14-8af305c08842","Type":"ContainerDied","Data":"0dbc7aa1c2d805835e66377ce807601a751aa5336dc091b3a8d6c8bbb98ca668"} Oct 06 14:52:50 crc kubenswrapper[4867]: I1006 14:52:50.298007 4867 scope.go:117] "RemoveContainer" containerID="0dbc7aa1c2d805835e66377ce807601a751aa5336dc091b3a8d6c8bbb98ca668" Oct 06 14:52:50 crc kubenswrapper[4867]: I1006 14:52:50.552761 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fq2cw_must-gather-7xspx_0c0c2dd3-6303-4a00-be14-8af305c08842/gather/0.log" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.303038 4867 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m99tm"] Oct 06 14:52:52 crc kubenswrapper[4867]: E1006 14:52:52.304662 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" containerName="registry-server" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.304681 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" containerName="registry-server" Oct 06 14:52:52 crc kubenswrapper[4867]: E1006 14:52:52.304710 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" containerName="extract-utilities" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.304723 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" containerName="extract-utilities" Oct 06 14:52:52 crc kubenswrapper[4867]: E1006 14:52:52.304785 4867 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" containerName="extract-content" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.304795 4867 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" containerName="extract-content" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.305073 4867 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf3e2e9-ad6e-4b34-acce-f3654649c69a" containerName="registry-server" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.310579 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.312209 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m99tm"] Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.483377 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-utilities\") pod \"redhat-operators-m99tm\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") " pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.483767 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/0a97458f-cee3-4b6e-b17a-ec43fa850edb-kube-api-access-jkg9b\") pod \"redhat-operators-m99tm\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") " pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.483924 4867 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-catalog-content\") pod \"redhat-operators-m99tm\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") " pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.585809 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-utilities\") pod \"redhat-operators-m99tm\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") " pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.585888 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/0a97458f-cee3-4b6e-b17a-ec43fa850edb-kube-api-access-jkg9b\") pod \"redhat-operators-m99tm\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") " pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.585925 4867 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-catalog-content\") pod \"redhat-operators-m99tm\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") " pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.586752 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-utilities\") pod \"redhat-operators-m99tm\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") " pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.587095 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-catalog-content\") pod \"redhat-operators-m99tm\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") " pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.612332 4867 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/0a97458f-cee3-4b6e-b17a-ec43fa850edb-kube-api-access-jkg9b\") pod \"redhat-operators-m99tm\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") " pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:52:52 crc kubenswrapper[4867]: I1006 14:52:52.637844 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:52:53 crc kubenswrapper[4867]: I1006 14:52:53.128580 4867 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m99tm"] Oct 06 14:52:53 crc kubenswrapper[4867]: I1006 14:52:53.328893 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m99tm" event={"ID":"0a97458f-cee3-4b6e-b17a-ec43fa850edb","Type":"ContainerStarted","Data":"aa4ebbbb41c49dcb875e225249aafee8e300c4d6b206178f62198c458d5118af"} Oct 06 14:52:54 crc kubenswrapper[4867]: I1006 14:52:54.354396 4867 generic.go:334] "Generic (PLEG): container finished" podID="0a97458f-cee3-4b6e-b17a-ec43fa850edb" containerID="590a4c1837452861755281b82ab5f22cf4f7e98def6af5221cb8799e4d7441ae" exitCode=0 Oct 06 14:52:54 crc kubenswrapper[4867]: I1006 14:52:54.354836 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m99tm" event={"ID":"0a97458f-cee3-4b6e-b17a-ec43fa850edb","Type":"ContainerDied","Data":"590a4c1837452861755281b82ab5f22cf4f7e98def6af5221cb8799e4d7441ae"} Oct 06 14:52:56 crc kubenswrapper[4867]: I1006 14:52:56.378184 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m99tm" event={"ID":"0a97458f-cee3-4b6e-b17a-ec43fa850edb","Type":"ContainerStarted","Data":"c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f"} Oct 06 14:53:00 crc kubenswrapper[4867]: I1006 14:53:00.423913 4867 generic.go:334] "Generic (PLEG): container finished" podID="0a97458f-cee3-4b6e-b17a-ec43fa850edb" containerID="c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f" exitCode=0 Oct 06 14:53:00 crc kubenswrapper[4867]: I1006 14:53:00.424042 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m99tm" 
event={"ID":"0a97458f-cee3-4b6e-b17a-ec43fa850edb","Type":"ContainerDied","Data":"c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f"} Oct 06 14:53:01 crc kubenswrapper[4867]: I1006 14:53:01.436693 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m99tm" event={"ID":"0a97458f-cee3-4b6e-b17a-ec43fa850edb","Type":"ContainerStarted","Data":"461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283"} Oct 06 14:53:01 crc kubenswrapper[4867]: I1006 14:53:01.460278 4867 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m99tm" podStartSLOduration=2.942775838 podStartE2EDuration="9.46024582s" podCreationTimestamp="2025-10-06 14:52:52 +0000 UTC" firstStartedPulling="2025-10-06 14:52:54.35794397 +0000 UTC m=+6553.815892104" lastFinishedPulling="2025-10-06 14:53:00.875413932 +0000 UTC m=+6560.333362086" observedRunningTime="2025-10-06 14:53:01.456617302 +0000 UTC m=+6560.914565446" watchObservedRunningTime="2025-10-06 14:53:01.46024582 +0000 UTC m=+6560.918193964" Oct 06 14:53:02 crc kubenswrapper[4867]: I1006 14:53:02.639118 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:53:02 crc kubenswrapper[4867]: I1006 14:53:02.639473 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:53:03 crc kubenswrapper[4867]: I1006 14:53:03.693627 4867 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m99tm" podUID="0a97458f-cee3-4b6e-b17a-ec43fa850edb" containerName="registry-server" probeResult="failure" output=< Oct 06 14:53:03 crc kubenswrapper[4867]: timeout: failed to connect service ":50051" within 1s Oct 06 14:53:03 crc kubenswrapper[4867]: > Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.123899 4867 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-must-gather-fq2cw/must-gather-7xspx"] Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.124596 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fq2cw/must-gather-7xspx" podUID="0c0c2dd3-6303-4a00-be14-8af305c08842" containerName="copy" containerID="cri-o://f20472ab52eeba7cd39ec5a95b4f98cc2bd3582cef1c37987d053fa461cdffc2" gracePeriod=2 Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.135575 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fq2cw/must-gather-7xspx"] Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.469350 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fq2cw_must-gather-7xspx_0c0c2dd3-6303-4a00-be14-8af305c08842/copy/0.log" Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.470043 4867 generic.go:334] "Generic (PLEG): container finished" podID="0c0c2dd3-6303-4a00-be14-8af305c08842" containerID="f20472ab52eeba7cd39ec5a95b4f98cc2bd3582cef1c37987d053fa461cdffc2" exitCode=143 Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.600912 4867 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fq2cw_must-gather-7xspx_0c0c2dd3-6303-4a00-be14-8af305c08842/copy/0.log" Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.601417 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq2cw/must-gather-7xspx" Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.668501 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0c0c2dd3-6303-4a00-be14-8af305c08842-must-gather-output\") pod \"0c0c2dd3-6303-4a00-be14-8af305c08842\" (UID: \"0c0c2dd3-6303-4a00-be14-8af305c08842\") " Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.668659 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxqzl\" (UniqueName: \"kubernetes.io/projected/0c0c2dd3-6303-4a00-be14-8af305c08842-kube-api-access-kxqzl\") pod \"0c0c2dd3-6303-4a00-be14-8af305c08842\" (UID: \"0c0c2dd3-6303-4a00-be14-8af305c08842\") " Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.681432 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0c2dd3-6303-4a00-be14-8af305c08842-kube-api-access-kxqzl" (OuterVolumeSpecName: "kube-api-access-kxqzl") pod "0c0c2dd3-6303-4a00-be14-8af305c08842" (UID: "0c0c2dd3-6303-4a00-be14-8af305c08842"). InnerVolumeSpecName "kube-api-access-kxqzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.770468 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxqzl\" (UniqueName: \"kubernetes.io/projected/0c0c2dd3-6303-4a00-be14-8af305c08842-kube-api-access-kxqzl\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.824291 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c0c2dd3-6303-4a00-be14-8af305c08842-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0c0c2dd3-6303-4a00-be14-8af305c08842" (UID: "0c0c2dd3-6303-4a00-be14-8af305c08842"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:53:04 crc kubenswrapper[4867]: I1006 14:53:04.913656 4867 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0c0c2dd3-6303-4a00-be14-8af305c08842-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 06 14:53:05 crc kubenswrapper[4867]: I1006 14:53:05.239083 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c0c2dd3-6303-4a00-be14-8af305c08842" path="/var/lib/kubelet/pods/0c0c2dd3-6303-4a00-be14-8af305c08842/volumes" Oct 06 14:53:05 crc kubenswrapper[4867]: I1006 14:53:05.252855 4867 scope.go:117] "RemoveContainer" containerID="f20472ab52eeba7cd39ec5a95b4f98cc2bd3582cef1c37987d053fa461cdffc2" Oct 06 14:53:05 crc kubenswrapper[4867]: I1006 14:53:05.304532 4867 scope.go:117] "RemoveContainer" containerID="0dbc7aa1c2d805835e66377ce807601a751aa5336dc091b3a8d6c8bbb98ca668" Oct 06 14:53:05 crc kubenswrapper[4867]: I1006 14:53:05.479002 4867 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fq2cw/must-gather-7xspx" Oct 06 14:53:12 crc kubenswrapper[4867]: I1006 14:53:12.690469 4867 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:53:12 crc kubenswrapper[4867]: I1006 14:53:12.736575 4867 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m99tm" Oct 06 14:53:12 crc kubenswrapper[4867]: I1006 14:53:12.920644 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m99tm"] Oct 06 14:53:14 crc kubenswrapper[4867]: I1006 14:53:14.564129 4867 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m99tm" podUID="0a97458f-cee3-4b6e-b17a-ec43fa850edb" containerName="registry-server" containerID="cri-o://461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283" gracePeriod=2 Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.064408 4867 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m99tm"
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.124975 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-catalog-content\") pod \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") "
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.125396 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-utilities\") pod \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") "
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.125587 4867 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/0a97458f-cee3-4b6e-b17a-ec43fa850edb-kube-api-access-jkg9b\") pod \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\" (UID: \"0a97458f-cee3-4b6e-b17a-ec43fa850edb\") "
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.126247 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-utilities" (OuterVolumeSpecName: "utilities") pod "0a97458f-cee3-4b6e-b17a-ec43fa850edb" (UID: "0a97458f-cee3-4b6e-b17a-ec43fa850edb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.126962 4867 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.133752 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a97458f-cee3-4b6e-b17a-ec43fa850edb-kube-api-access-jkg9b" (OuterVolumeSpecName: "kube-api-access-jkg9b") pod "0a97458f-cee3-4b6e-b17a-ec43fa850edb" (UID: "0a97458f-cee3-4b6e-b17a-ec43fa850edb"). InnerVolumeSpecName "kube-api-access-jkg9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.218678 4867 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a97458f-cee3-4b6e-b17a-ec43fa850edb" (UID: "0a97458f-cee3-4b6e-b17a-ec43fa850edb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.229046 4867 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/0a97458f-cee3-4b6e-b17a-ec43fa850edb-kube-api-access-jkg9b\") on node \"crc\" DevicePath \"\""
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.229090 4867 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a97458f-cee3-4b6e-b17a-ec43fa850edb-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.573725 4867 generic.go:334] "Generic (PLEG): container finished" podID="0a97458f-cee3-4b6e-b17a-ec43fa850edb" containerID="461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283" exitCode=0
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.573764 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m99tm" event={"ID":"0a97458f-cee3-4b6e-b17a-ec43fa850edb","Type":"ContainerDied","Data":"461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283"}
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.573792 4867 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m99tm" event={"ID":"0a97458f-cee3-4b6e-b17a-ec43fa850edb","Type":"ContainerDied","Data":"aa4ebbbb41c49dcb875e225249aafee8e300c4d6b206178f62198c458d5118af"}
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.573801 4867 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m99tm"
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.573812 4867 scope.go:117] "RemoveContainer" containerID="461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283"
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.596005 4867 scope.go:117] "RemoveContainer" containerID="c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f"
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.597792 4867 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m99tm"]
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.607330 4867 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m99tm"]
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.618223 4867 scope.go:117] "RemoveContainer" containerID="590a4c1837452861755281b82ab5f22cf4f7e98def6af5221cb8799e4d7441ae"
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.674756 4867 scope.go:117] "RemoveContainer" containerID="461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283"
Oct 06 14:53:15 crc kubenswrapper[4867]: E1006 14:53:15.675236 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283\": container with ID starting with 461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283 not found: ID does not exist" containerID="461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283"
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.675307 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283"} err="failed to get container status \"461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283\": rpc error: code = NotFound desc = could not find container \"461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283\": container with ID starting with 461ba91cabb243e584bf540fff7b61a63af2bf69a837b5d0dd2fc269788b9283 not found: ID does not exist"
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.675338 4867 scope.go:117] "RemoveContainer" containerID="c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f"
Oct 06 14:53:15 crc kubenswrapper[4867]: E1006 14:53:15.675665 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f\": container with ID starting with c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f not found: ID does not exist" containerID="c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f"
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.675690 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f"} err="failed to get container status \"c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f\": rpc error: code = NotFound desc = could not find container \"c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f\": container with ID starting with c953381540cb09102ac0380ee3eb434cefcb71f767acb250cbef9a77ab200f2f not found: ID does not exist"
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.675705 4867 scope.go:117] "RemoveContainer" containerID="590a4c1837452861755281b82ab5f22cf4f7e98def6af5221cb8799e4d7441ae"
Oct 06 14:53:15 crc kubenswrapper[4867]: E1006 14:53:15.676163 4867 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"590a4c1837452861755281b82ab5f22cf4f7e98def6af5221cb8799e4d7441ae\": container with ID starting with 590a4c1837452861755281b82ab5f22cf4f7e98def6af5221cb8799e4d7441ae not found: ID does not exist" containerID="590a4c1837452861755281b82ab5f22cf4f7e98def6af5221cb8799e4d7441ae"
Oct 06 14:53:15 crc kubenswrapper[4867]: I1006 14:53:15.676186 4867 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"590a4c1837452861755281b82ab5f22cf4f7e98def6af5221cb8799e4d7441ae"} err="failed to get container status \"590a4c1837452861755281b82ab5f22cf4f7e98def6af5221cb8799e4d7441ae\": rpc error: code = NotFound desc = could not find container \"590a4c1837452861755281b82ab5f22cf4f7e98def6af5221cb8799e4d7441ae\": container with ID starting with 590a4c1837452861755281b82ab5f22cf4f7e98def6af5221cb8799e4d7441ae not found: ID does not exist"
Oct 06 14:53:17 crc kubenswrapper[4867]: I1006 14:53:17.235611 4867 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a97458f-cee3-4b6e-b17a-ec43fa850edb" path="/var/lib/kubelet/pods/0a97458f-cee3-4b6e-b17a-ec43fa850edb/volumes"